Nov 24 11:06:32 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 24 11:06:32 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:32 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 11:06:33 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 24 11:06:34 crc kubenswrapper[4752]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.470281 4752 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.479966 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480062 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480076 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480086 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480096 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480190 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480337 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480576 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480584 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480591 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480596 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480601 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480607 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480611 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480616 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480620 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480624 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480628 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480631 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480636 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480640 4752 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480643 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480648 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480651 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480656 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480660 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480663 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480667 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480670 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480674 4752 feature_gate.go:330] unrecognized feature gate: Example Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480678 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480681 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480685 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480689 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480692 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480696 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480700 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480704 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480714 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
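The long run of W-level "unrecognized feature gate" lines is largely expected noise: names such as NewOLM, ManagedBootImages, and ClusterAPIInstall appear to be OpenShift-level gates that the upstream kubelet's gate registry does not know, so it warns and ignores them. Only the "Setting GA/deprecated feature gate" lines change anything in the kubelet itself. When scanning this unit for real problems it can help to filter the noise, for example (assuming the systemd unit is named kubelet, as the "Starting Kubernetes Kubelet" line suggests):

    journalctl -u kubelet --no-pager | grep -v 'unrecognized feature gate'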
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480721 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480727 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480733 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480757 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480762 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480768 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480773 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480778 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480783 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480789 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480794 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480799 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480802 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480807 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480810 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480814 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480818 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480823 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480826 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480830 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480834 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480838 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480842 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480847 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480851 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480855 4752 feature_gate.go:330] unrecognized feature gate: 
AlibabaPlatform Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480859 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480862 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480866 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480871 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480876 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.480881 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481007 4752 flags.go:64] FLAG: --address="0.0.0.0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481020 4752 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481030 4752 flags.go:64] FLAG: --anonymous-auth="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481037 4752 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481044 4752 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481049 4752 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481055 4752 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481061 4752 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481065 4752 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481069 4752 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481074 4752 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481079 4752 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481083 4752 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481087 4752 flags.go:64] FLAG: --cgroup-root="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481091 4752 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481095 4752 flags.go:64] FLAG: --client-ca-file="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481099 4752 flags.go:64] FLAG: --cloud-config="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481103 4752 flags.go:64] FLAG: --cloud-provider="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481106 4752 flags.go:64] FLAG: --cluster-dns="[]" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481112 4752 flags.go:64] FLAG: --cluster-domain="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481116 4752 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481120 4752 flags.go:64] FLAG: --config-dir="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 
11:06:34.481124 4752 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481131 4752 flags.go:64] FLAG: --container-log-max-files="5" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481136 4752 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481141 4752 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481145 4752 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481150 4752 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481154 4752 flags.go:64] FLAG: --contention-profiling="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481158 4752 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481162 4752 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481166 4752 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481171 4752 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481177 4752 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481181 4752 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481185 4752 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481188 4752 flags.go:64] FLAG: --enable-load-reader="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481193 4752 flags.go:64] FLAG: --enable-server="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481197 4752 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481208 4752 flags.go:64] FLAG: --event-burst="100" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481212 4752 flags.go:64] FLAG: --event-qps="50" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481216 4752 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481220 4752 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481225 4752 flags.go:64] FLAG: --eviction-hard="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481230 4752 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481234 4752 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481238 4752 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481242 4752 flags.go:64] FLAG: --eviction-soft="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481246 4752 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481250 4752 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481254 4752 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481258 4752 flags.go:64] FLAG: 
--experimental-mounter-path="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481262 4752 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481266 4752 flags.go:64] FLAG: --fail-swap-on="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481270 4752 flags.go:64] FLAG: --feature-gates="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481277 4752 flags.go:64] FLAG: --file-check-frequency="20s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481281 4752 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481286 4752 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481291 4752 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481295 4752 flags.go:64] FLAG: --healthz-port="10248" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481299 4752 flags.go:64] FLAG: --help="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481303 4752 flags.go:64] FLAG: --hostname-override="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481308 4752 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481312 4752 flags.go:64] FLAG: --http-check-frequency="20s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481316 4752 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481320 4752 flags.go:64] FLAG: --image-credential-provider-config="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481324 4752 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481328 4752 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481332 4752 flags.go:64] FLAG: --image-service-endpoint="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481336 4752 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481340 4752 flags.go:64] FLAG: --kube-api-burst="100" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481344 4752 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481348 4752 flags.go:64] FLAG: --kube-api-qps="50" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481352 4752 flags.go:64] FLAG: --kube-reserved="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481360 4752 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481364 4752 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481368 4752 flags.go:64] FLAG: --kubelet-cgroups="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481372 4752 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481376 4752 flags.go:64] FLAG: --lock-file="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481381 4752 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481386 4752 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481391 4752 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 24 11:06:34 crc 
kubenswrapper[4752]: I1124 11:06:34.481399 4752 flags.go:64] FLAG: --log-json-split-stream="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481405 4752 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481410 4752 flags.go:64] FLAG: --log-text-split-stream="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481415 4752 flags.go:64] FLAG: --logging-format="text" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481419 4752 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481424 4752 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481428 4752 flags.go:64] FLAG: --manifest-url="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481432 4752 flags.go:64] FLAG: --manifest-url-header="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481440 4752 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481444 4752 flags.go:64] FLAG: --max-open-files="1000000" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481450 4752 flags.go:64] FLAG: --max-pods="110" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481454 4752 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481458 4752 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481462 4752 flags.go:64] FLAG: --memory-manager-policy="None" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481466 4752 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481471 4752 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481475 4752 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481479 4752 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481495 4752 flags.go:64] FLAG: --node-status-max-images="50" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481500 4752 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481505 4752 flags.go:64] FLAG: --oom-score-adj="-999" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481511 4752 flags.go:64] FLAG: --pod-cidr="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481516 4752 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481524 4752 flags.go:64] FLAG: --pod-manifest-path="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481531 4752 flags.go:64] FLAG: --pod-max-pids="-1" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481537 4752 flags.go:64] FLAG: --pods-per-core="0" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481542 4752 flags.go:64] FLAG: --port="10250" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481547 4752 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481552 4752 flags.go:64] FLAG: --provider-id="" Nov 24 
11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481558 4752 flags.go:64] FLAG: --qos-reserved="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481563 4752 flags.go:64] FLAG: --read-only-port="10255" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481568 4752 flags.go:64] FLAG: --register-node="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481579 4752 flags.go:64] FLAG: --register-schedulable="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481584 4752 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481593 4752 flags.go:64] FLAG: --registry-burst="10" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481597 4752 flags.go:64] FLAG: --registry-qps="5" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481602 4752 flags.go:64] FLAG: --reserved-cpus="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481609 4752 flags.go:64] FLAG: --reserved-memory="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481617 4752 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481622 4752 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481628 4752 flags.go:64] FLAG: --rotate-certificates="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481633 4752 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481638 4752 flags.go:64] FLAG: --runonce="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481643 4752 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481649 4752 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481654 4752 flags.go:64] FLAG: --seccomp-default="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481657 4752 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481662 4752 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481666 4752 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481670 4752 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481675 4752 flags.go:64] FLAG: --storage-driver-password="root" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481680 4752 flags.go:64] FLAG: --storage-driver-secure="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481685 4752 flags.go:64] FLAG: --storage-driver-table="stats" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481695 4752 flags.go:64] FLAG: --storage-driver-user="root" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481702 4752 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481708 4752 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481713 4752 flags.go:64] FLAG: --system-cgroups="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481719 4752 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481729 4752 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 
24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481734 4752 flags.go:64] FLAG: --tls-cert-file="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481762 4752 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481774 4752 flags.go:64] FLAG: --tls-min-version="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481778 4752 flags.go:64] FLAG: --tls-private-key-file="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481783 4752 flags.go:64] FLAG: --topology-manager-policy="none" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481787 4752 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481792 4752 flags.go:64] FLAG: --topology-manager-scope="container" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481796 4752 flags.go:64] FLAG: --v="2" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481803 4752 flags.go:64] FLAG: --version="false" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481809 4752 flags.go:64] FLAG: --vmodule="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481814 4752 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.481819 4752 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481940 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481945 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481950 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481954 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481959 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481962 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481966 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481970 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481973 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481977 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481981 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481985 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481990 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
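The FLAG dump above (logged at this verbosity, --v="2") records command-line values only, before they are merged with /etc/kubernetes/kubelet.conf. To see the effective merged configuration of a running kubelet, the configz endpoint can be read through the API server proxy, if that endpoint is enabled; a sketch, with <node> standing in for this node's name (crc here):

    kubectl get --raw "/api/v1/nodes/<node>/proxy/configz" | python3 -m json.tool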
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.481996 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482001 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482005 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482010 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482014 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482019 4752 feature_gate.go:330] unrecognized feature gate: Example Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482023 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482026 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482030 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482034 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482038 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482042 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482045 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482049 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482052 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482057 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482061 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482065 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482069 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482073 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482077 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482081 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482085 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482089 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482092 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482096 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482100 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482103 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482107 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482111 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482116 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482119 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482123 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482127 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482132 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482138 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482142 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482146 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482151 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482155 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482158 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482162 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482165 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482169 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482172 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482176 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482179 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482183 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482186 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482192 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482196 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482200 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482203 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482207 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482212 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482216 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482220 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.482224 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.482240 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.495408 4752 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.495445 4752 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495592 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495604 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495614 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495658 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
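Each parsing pass ends with an I-level feature_gate.go:386 line giving the surviving gate map: only names known to the upstream registry remain (CloudDualStackNodeIPs, KMSv1, ValidatingAdmissionPolicy, and so on); everything warned about above was dropped. For reference only, the same gates would be written in KubeletConfiguration as below; on OpenShift the featureGates stanza is rendered from the cluster FeatureGate resource rather than set by hand:

    featureGates:
      CloudDualStackNodeIPs: true
      DisableKubeletCloudCredentialProviders: true
      KMSv1: true
      ValidatingAdmissionPolicy: true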
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495670 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495682 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495691 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495700 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495709 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495718 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495727 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495735 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495778 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495786 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495794 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495802 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495811 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495821 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495832 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495842 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495852 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495862 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495871 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495879 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495887 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495896 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495904 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495914 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495922 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495931 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495940 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495948 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495956 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495965 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495974 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495983 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.495992 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496000 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496008 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496018 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496028 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496037 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496045 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496054 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496062 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496069 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496078 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496086 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496094 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496102 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496109 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496117 4752 feature_gate.go:330] unrecognized feature gate: Example Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496125 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496133 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496140 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496148 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496156 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496164 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496172 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496179 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496187 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496195 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496204 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496211 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496219 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496228 4752 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496236 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496243 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496251 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496261 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496272 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.496285 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496508 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496522 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496535 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496547 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496560 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496573 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496583 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496591 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496600 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496609 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496619 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496627 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496635 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496643 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496651 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496659 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496666 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496674 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496681 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496689 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496697 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496705 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496712 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496720 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496729 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496790 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496802 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496812 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496822 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496834 4752 feature_gate.go:330] 
unrecognized feature gate: ChunkSizeMiB Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496843 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496851 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496860 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496868 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496875 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496884 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496892 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496901 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496909 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496918 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496926 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496935 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496943 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496951 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496958 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496967 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496975 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496983 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.496991 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497001 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497010 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497018 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497029 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497038 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497047 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497057 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497067 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497075 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497086 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497095 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497103 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497111 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497120 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497128 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497136 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497145 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497154 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497162 4752 feature_gate.go:330] unrecognized feature gate: Example Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497170 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497178 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.497186 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.497198 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.498603 4752 server.go:940] "Client rotation is on, will bootstrap in background" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.504096 4752 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.506292 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
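The certificate entries here and just below are worth annotating: client rotation is on, the bootstrap kubeconfig is still valid, and the client pair comes from /var/lib/kubelet/pki/kubelet-client-current.pem. The rotation deadline logged next (2025-11-23) is already in the past at this boot (Nov 24), which is why the kubelet immediately logs "Rotating certificates"; the first attempt then fails with connection refused against https://api-int.crc.testing:6443, which is normal this early in boot on a single-node cluster where the API server is not up yet, and the certificate manager retries later. Two quick checks, with paths and names taken from this log (oc and kubectl are interchangeable here):

    # Inspect the client certificate the kubelet just loaded
    openssl x509 -in /var/lib/kubelet/pki/kubelet-client-current.pem -noout -subject -enddate
    # Once the API server answers, the retried rotation shows up as a CSR
    oc get csr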
Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.509573 4752 server.go:997] "Starting client certificate rotation" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.509624 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.509828 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-23 17:54:35.61402327 +0000 UTC Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.509955 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.538016 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.540266 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.543449 4752 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.567392 4752 log.go:25] "Validated CRI v1 runtime API" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.609473 4752 log.go:25] "Validated CRI v1 image API" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.611887 4752 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.617836 4752 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-24-11-01-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.617865 4752 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.639117 4752 manager.go:217] Machine: {Timestamp:2025-11-24 11:06:34.636901953 +0000 UTC m=+0.621722282 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:366115a7-2c9a-450b-9862-da5d0db853ac BootID:47425241-83f5-42c8-9f71-0c166d7ef9e2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b8:aa:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b8:aa:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:bf:6e:74 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:72:6a:9a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cb:10:ac Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f6:8e:1e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:f0:05:71 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:53:12:c3:b4:69 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:5f:81:77:f1:22 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified 
Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.639425 4752 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.639599 4752 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.642677 4752 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.642959 4752 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.643004 4752 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.643269 4752 topology_manager.go:138] "Creating topology manager with none policy" Nov 24 
11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.643283 4752 container_manager_linux.go:303] "Creating device plugin manager" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.643986 4752 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.644030 4752 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.644357 4752 state_mem.go:36] "Initialized new in-memory state store" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.644463 4752 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.648310 4752 kubelet.go:418] "Attempting to sync node with API server" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.648337 4752 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.648402 4752 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.648420 4752 kubelet.go:324] "Adding apiserver pod source" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.648441 4752 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.653353 4752 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.654414 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
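The Container Manager nodeConfig above pins down the node-allocatable inputs: SystemReserved cpu=200m, memory=350Mi, ephemeral-storage=350Mi; KubeReserved=null; and a hard eviction threshold of memory.available<100Mi. Combined with MemoryCapacity=33654124544 bytes from the Machine record, the standard formula (Allocatable = Capacity - KubeReserved - SystemReserved - hard eviction threshold) yields the memory the scheduler may hand to pods. A quick sketch of the arithmetic follows; the input values are copied from the log, but the program itself is not kubelet code.

package main

import "fmt"

// Inputs copied from the nodeConfig and Machine records logged above.
const (
	mib            int64 = 1 << 20
	capacity       int64 = 33654124544 // MemoryCapacity in bytes (~31.3 GiB)
	systemReserved       = 350 * mib   // SystemReserved memory: "350Mi"
	kubeReserved   int64 = 0           // KubeReserved is null in this config
	hardEviction         = 100 * mib   // eviction threshold memory.available: "100Mi"
)

func main() {
	allocatable := capacity - kubeReserved - systemReserved - hardEviction
	fmt.Printf("memory capacity:    %d bytes (%.2f GiB)\n", capacity, float64(capacity)/float64(1<<30))
	fmt.Printf("memory allocatable: %d bytes (%.2f GiB)\n", allocatable, float64(allocatable)/float64(1<<30))
}

CPU allocatable subtracts only the 200m reservation; eviction thresholds are memory and storage signals and do not apply to CPU.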
Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.656266 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.656457 4752 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.656454 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.656274 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.657016 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658095 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658132 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658143 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658153 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658169 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658178 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658192 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658206 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658227 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658241 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658271 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658288 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.658348 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 
11:06:34.659307 4752 server.go:1280] "Started kubelet" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.659551 4752 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.661159 4752 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.661538 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:34 crc systemd[1]: Started Kubernetes Kubelet. Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.662173 4752 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.663428 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.663466 4752 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.663614 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:33:46.407839768 +0000 UTC Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.663964 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 588h27m11.743885802s for next certificate rotation Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.663940 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.664388 4752 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.664421 4752 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.665077 4752 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.665133 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.670159 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.671900 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.145:6443: connect: connection refused" interval="200ms" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674028 4752 factory.go:153] Registering CRI-O factory Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674191 4752 factory.go:221] Registration of the crio container factory successfully Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 
11:06:34.674297 4752 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674316 4752 factory.go:55] Registering systemd factory Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674330 4752 factory.go:221] Registration of the systemd container factory successfully Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674362 4752 factory.go:103] Registering Raw factory Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.674381 4752 manager.go:1196] Started watching for new ooms in manager Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.675258 4752 server.go:460] "Adding debug handlers to kubelet server" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.675859 4752 manager.go:319] Starting recovery of all containers Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.684518 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.145:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187aeca457b54e5e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-24 11:06:34.659262046 +0000 UTC m=+0.644082335,LastTimestamp:2025-11-24 11:06:34.659262046 +0000 UTC m=+0.644082335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691444 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691605 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691624 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691636 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691650 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691795 4752 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691813 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691830 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691869 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691888 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691905 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691920 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691932 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.691984 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692002 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692037 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692051 4752 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692107 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692126 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692138 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692151 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692204 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.692217 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694774 4752 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694812 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694827 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694845 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 24 11:06:34 crc 
kubenswrapper[4752]: I1124 11:06:34.694861 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694876 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694893 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694916 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.694993 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695078 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695156 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695169 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695181 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695201 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695264 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695280 
4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695334 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695347 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695360 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695374 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695388 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695402 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695417 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695433 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695501 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695517 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695551 4752 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695565 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695579 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695596 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695658 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695723 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695777 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695801 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695817 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695835 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695849 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695924 4752 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.695941 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696004 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696113 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696131 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696177 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696200 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696214 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696237 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696253 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696268 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696313 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696327 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696341 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696368 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696389 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696426 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696443 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696459 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696499 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696514 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696536 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696551 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696564 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696578 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696591 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696602 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696660 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696673 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696880 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696900 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.696911 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697027 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697042 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697056 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697105 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697123 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697139 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697151 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697163 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697177 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697193 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697207 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697244 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697263 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697286 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697301 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697314 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697387 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697400 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697415 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697436 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697454 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697469 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697482 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697495 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697514 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697530 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697542 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697554 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697566 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697577 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697588 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697601 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697615 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697630 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697643 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" 
seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697710 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697727 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697758 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697773 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697787 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697807 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697825 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697870 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697884 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697898 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697909 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 24 11:06:34 crc 
kubenswrapper[4752]: I1124 11:06:34.697922 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697936 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697947 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697961 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697975 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697987 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.697998 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698012 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698024 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698035 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698047 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 24 11:06:34 crc 
kubenswrapper[4752]: I1124 11:06:34.698393 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698407 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698420 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698430 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698442 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698453 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698464 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698478 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698490 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698487 4752 manager.go:324] Recovery completed Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698501 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698629 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698664 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698679 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698690 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698701 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698713 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698797 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698808 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698818 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698827 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698837 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.698846 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699303 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699319 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699345 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699357 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699368 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699386 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699406 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699421 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.699433 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700122 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700226 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700256 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700296 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700327 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700368 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700394 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700418 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700453 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700481 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700561 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700602 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700623 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700659 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700686 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700715 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700786 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700810 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700844 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700869 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700892 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700927 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700950 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.700987 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701011 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701036 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701067 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701090 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701108 4752 reconstruct.go:97] "Volume reconstruction finished" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.701121 4752 reconciler.go:26] "Reconciler: start to sync state" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.713738 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.716946 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.716988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.716996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.718368 4752 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.718410 4752 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.718446 4752 state_mem.go:36] "Initialized new in-memory state store" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.724136 4752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.726409 4752 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.726469 4752 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.726695 4752 kubelet.go:2335] "Starting kubelet main sync loop" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.726856 4752 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 24 11:06:34 crc kubenswrapper[4752]: W1124 11:06:34.727257 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.727317 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.743278 4752 policy_none.go:49] "None policy: Start" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.744199 4752 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.744236 4752 state_mem.go:35] "Initializing new in-memory state store" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.764548 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.790686 4752 manager.go:334] "Starting Device Plugin manager" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791076 4752 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791138 4752 server.go:79] "Starting device plugin registration server" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791591 4752 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791661 4752 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791882 4752 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.791997 4752 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.792017 4752 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.803931 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.827123 4752 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 24 11:06:34 crc kubenswrapper[4752]: 
I1124 11:06:34.827227 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.828500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.828526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.828536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.828650 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.829024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.829096 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830210 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830391 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830566 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.830653 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832627 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832865 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832872 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832998 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.833032 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.832877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.835005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.835043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.835058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.837817 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.837856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.837868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.837982 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.838557 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.838627 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.839913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.839958 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.839973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.839988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.840047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.840060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.840249 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.840277 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.841016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.841086 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.841103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.874348 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.145:6443: connect: connection refused" interval="400ms" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.892666 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.893978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.894018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.894062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.894094 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:34 crc kubenswrapper[4752]: E1124 11:06:34.894645 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.145:6443: connect: 
connection refused" node="crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.902998 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.903025 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.903072 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.903089 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.903106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904291 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904386 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904434 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904570 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904612 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904691 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904724 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:34 crc kubenswrapper[4752]: I1124 11:06:34.904895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.006593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.006956 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.006967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007174 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007225 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007299 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007354 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007373 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007468 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007486 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007522 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007528 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007552 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007684 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007696 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007796 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007581 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:35 crc 
kubenswrapper[4752]: I1124 11:06:35.007617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007873 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007911 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.008004 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.007730 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.008095 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.094825 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.096802 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.096852 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.096870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.096902 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.097557 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.145:6443: connect: connection refused" node="crc" Nov 24 
11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.161965 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.189037 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.203347 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.232498 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.243170 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.260287 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e90ca28ad4429dd6a4b5d40109d8083e91c33045c2d89cee83f76dca925cf69a WatchSource:0}: Error finding container e90ca28ad4429dd6a4b5d40109d8083e91c33045c2d89cee83f76dca925cf69a: Status 404 returned error can't find the container with id e90ca28ad4429dd6a4b5d40109d8083e91c33045c2d89cee83f76dca925cf69a Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.266313 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5f558b79467faf85da9fa629a3e761ceef8dc208f19e673f5c70aeb6fae633c2 WatchSource:0}: Error finding container 5f558b79467faf85da9fa629a3e761ceef8dc208f19e673f5c70aeb6fae633c2: Status 404 returned error can't find the container with id 5f558b79467faf85da9fa629a3e761ceef8dc208f19e673f5c70aeb6fae633c2 Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.274201 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-76942ad78d413cec3b8ff38c050a581aaf48070cd82783a6aaaf2e9813e8b48e WatchSource:0}: Error finding container 76942ad78d413cec3b8ff38c050a581aaf48070cd82783a6aaaf2e9813e8b48e: Status 404 returned error can't find the container with id 76942ad78d413cec3b8ff38c050a581aaf48070cd82783a6aaaf2e9813e8b48e Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.275503 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.145:6443: connect: connection refused" interval="800ms" Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.275845 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6161f081a7af892bb418261f0c1b6f0c292b3d33240b2c5536b0d198f4ec6159 WatchSource:0}: Error finding container 6161f081a7af892bb418261f0c1b6f0c292b3d33240b2c5536b0d198f4ec6159: Status 404 returned error can't find the container with id 6161f081a7af892bb418261f0c1b6f0c292b3d33240b2c5536b0d198f4ec6159 Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.498191 4752 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.499717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.499770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.499780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.499799 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.500254 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.145:6443: connect: connection refused" node="crc" Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.608503 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.608593 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.662402 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.671480 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.671571 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:35 crc kubenswrapper[4752]: W1124 11:06:35.681237 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:35 crc kubenswrapper[4752]: E1124 11:06:35.681294 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.732005 4752 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"76942ad78d413cec3b8ff38c050a581aaf48070cd82783a6aaaf2e9813e8b48e"} Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.733103 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5f558b79467faf85da9fa629a3e761ceef8dc208f19e673f5c70aeb6fae633c2"} Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.733904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f8ae9fd8c4feee374cab5333a247c25f5e36ae9e9ad4f69337e605fddbd6d892"} Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.735888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e90ca28ad4429dd6a4b5d40109d8083e91c33045c2d89cee83f76dca925cf69a"} Nov 24 11:06:35 crc kubenswrapper[4752]: I1124 11:06:35.738036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6161f081a7af892bb418261f0c1b6f0c292b3d33240b2c5536b0d198f4ec6159"} Nov 24 11:06:36 crc kubenswrapper[4752]: E1124 11:06:36.077068 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.145:6443: connect: connection refused" interval="1.6s" Nov 24 11:06:36 crc kubenswrapper[4752]: W1124 11:06:36.153019 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:36 crc kubenswrapper[4752]: E1124 11:06:36.153141 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.300536 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.305024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.305061 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.305074 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.305100 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:36 crc kubenswrapper[4752]: E1124 11:06:36.305610 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.145:6443: connect: connection refused" node="crc" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.545819 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 24 11:06:36 crc kubenswrapper[4752]: E1124 11:06:36.547071 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.662285 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.744043 4752 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034" exitCode=0 Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.744112 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.744159 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.745076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.745121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.745139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.746545 4752 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc" exitCode=0 Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.746715 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.746776 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.748100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.748153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.748170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.751214 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.751258 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.751271 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.751280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.751357 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.752373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.752408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.752419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.755825 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6" exitCode=0 Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.755934 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.756014 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.757349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.757404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.757430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.758103 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0" exitCode=0 Nov 24 
11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.758192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0"} Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.758428 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.759648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.759704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.759715 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.763650 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.765150 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.765203 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:36 crc kubenswrapper[4752]: I1124 11:06:36.765230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.662975 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:37 crc kubenswrapper[4752]: E1124 11:06:37.678906 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.145:6443: connect: connection refused" interval="3.2s" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.765223 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.765287 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.765319 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.765342 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.766466 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.766521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.766533 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.770267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.770325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.770340 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.770351 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.772269 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348" exitCode=0 Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.772337 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.772372 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.773129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.773183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.773234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.776048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97"} Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.776083 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.776179 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:37 crc 
kubenswrapper[4752]: I1124 11:06:37.781121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.781270 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.781302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.784780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.784825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.784842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: W1124 11:06:37.792575 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:37 crc kubenswrapper[4752]: E1124 11:06:37.792673 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:37 crc kubenswrapper[4752]: W1124 11:06:37.815516 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.145:6443: connect: connection refused Nov 24 11:06:37 crc kubenswrapper[4752]: E1124 11:06:37.815602 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.145:6443: connect: connection refused" logger="UnhandledError" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.906563 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.907799 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.907847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.907858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:37 crc kubenswrapper[4752]: I1124 11:06:37.907880 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:37 crc kubenswrapper[4752]: E1124 11:06:37.908436 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.145:6443: connect: 
connection refused" node="crc" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.781714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b"} Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.781985 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783846 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c" exitCode=0 Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783941 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783948 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783914 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c"} Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.783979 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.784032 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785095 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785410 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:38 crc kubenswrapper[4752]: I1124 11:06:38.785428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.126628 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790585 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0"} Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790633 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402"} Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790649 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354"} Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4"} Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790668 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790727 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.790811 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791703 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.791972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:39 crc kubenswrapper[4752]: I1124 11:06:39.800586 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.535308 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 
11:06:40.582319 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.798225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2"} Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.798359 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.798359 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.799701 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.799731 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.799770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.800661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.800677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:40 crc kubenswrapper[4752]: I1124 11:06:40.800687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.109336 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.112079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.112142 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.112163 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.112200 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.645608 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.645820 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.646796 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.646837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.646846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.800892 4752 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.800964 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802709 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:41 crc kubenswrapper[4752]: I1124 11:06:41.802808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.248738 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.249073 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.250988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.251051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.251060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.841865 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.842082 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.843956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.844021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.844046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:42 crc kubenswrapper[4752]: I1124 11:06:42.849336 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:43 crc kubenswrapper[4752]: I1124 11:06:43.808504 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:43 crc kubenswrapper[4752]: I1124 11:06:43.808693 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:43 crc kubenswrapper[4752]: I1124 11:06:43.810059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:43 crc kubenswrapper[4752]: I1124 11:06:43.810118 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:43 crc kubenswrapper[4752]: I1124 11:06:43.810138 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.639499 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.639840 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.641441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.641500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.641514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:44 crc kubenswrapper[4752]: E1124 11:06:44.804032 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.810135 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.811179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.811214 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.811253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.988354 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.988581 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.990293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.990384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:44 crc kubenswrapper[4752]: I1124 11:06:44.990405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.249054 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.249159 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.750578 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.813106 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.814436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.814498 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:45 crc kubenswrapper[4752]: I1124 11:06:45.814516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:48 crc kubenswrapper[4752]: I1124 11:06:48.663626 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 24 11:06:48 crc kubenswrapper[4752]: I1124 11:06:48.689814 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54994->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 24 11:06:48 crc kubenswrapper[4752]: I1124 11:06:48.689916 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54994->192.168.126.11:17697: read: connection reset by peer" Nov 24 11:06:48 crc kubenswrapper[4752]: W1124 11:06:48.743196 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 24 11:06:48 crc kubenswrapper[4752]: I1124 11:06:48.743288 4752 trace.go:236] Trace[1661682713]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 11:06:38.741) (total time: 10001ms): Nov 24 11:06:48 crc kubenswrapper[4752]: Trace[1661682713]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:06:48.743) Nov 24 11:06:48 crc kubenswrapper[4752]: Trace[1661682713]: [10.001669421s] [10.001669421s] END Nov 24 11:06:48 crc kubenswrapper[4752]: E1124 11:06:48.743323 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 24 11:06:48 crc kubenswrapper[4752]: W1124 11:06:48.759992 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 24 11:06:48 crc kubenswrapper[4752]: I1124 11:06:48.760087 4752 trace.go:236] Trace[102030584]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 11:06:38.758) (total time: 10001ms): Nov 24 11:06:48 crc kubenswrapper[4752]: Trace[102030584]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:06:48.759) Nov 24 11:06:48 crc kubenswrapper[4752]: Trace[102030584]: [10.00166715s] [10.00166715s] END Nov 24 11:06:48 crc kubenswrapper[4752]: E1124 11:06:48.760107 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.585982 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.586119 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.592439 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.592521 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.808608 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]log ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]etcd ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 24 11:06:49 crc 
kubenswrapper[4752]: [+]poststarthook/openshift.io-startkubeinformers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/generic-apiserver-start-informers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-filter ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-apiextensions-informers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-apiextensions-controllers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/crd-informer-synced ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-system-namespaces-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-service-ip-repair-controllers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 24 11:06:49 crc kubenswrapper[4752]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/bootstrap-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/start-kube-aggregator-informers ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-registration-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-discovery-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]autoregister-completion ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-openapi-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 24 11:06:49 crc kubenswrapper[4752]: livez check failed Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.808696 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.826586 4752 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.828923 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b" exitCode=255 Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.828966 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b"} Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.829108 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.830019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.830061 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.830077 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:49 crc kubenswrapper[4752]: I1124 11:06:49.830633 4752 scope.go:117] "RemoveContainer" containerID="3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b" Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.833701 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.835346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e"} Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.835503 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.836489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.836530 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:06:50 crc kubenswrapper[4752]: I1124 11:06:50.836544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.552307 4752 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.660365 4752 apiserver.go:52] "Watching apiserver" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.666673 4752 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667025 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667475 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667588 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667649 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667662 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667678 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:53 crc kubenswrapper[4752]: E1124 11:06:53.667840 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.667866 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:53 crc kubenswrapper[4752]: E1124 11:06:53.667926 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:06:53 crc kubenswrapper[4752]: E1124 11:06:53.668013 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.670503 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.670529 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.670825 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671228 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671280 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671346 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671368 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671594 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.671849 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.701230 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.712432 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.724343 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.740258 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.756732 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.765903 4752 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.768099 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:53 crc kubenswrapper[4752]: I1124 11:06:53.787620 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.591954 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.594167 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.594579 4752 trace.go:236] Trace[1953845324]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 11:06:43.336) (total time: 11257ms): Nov 24 11:06:54 crc kubenswrapper[4752]: Trace[1953845324]: ---"Objects listed" error: 11257ms (11:06:54.594) Nov 24 11:06:54 crc kubenswrapper[4752]: Trace[1953845324]: [11.257831074s] [11.257831074s] END Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.594609 4752 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.596988 4752 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.597827 4752 trace.go:236] Trace[422636970]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 11:06:41.856) (total time: 12741ms): Nov 24 11:06:54 crc kubenswrapper[4752]: Trace[422636970]: ---"Objects listed" error: 12740ms (11:06:54.597) Nov 24 11:06:54 crc kubenswrapper[4752]: Trace[422636970]: [12.741113036s] [12.741113036s] END Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.597854 4752 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.600719 4752 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.620596 4752 csr.go:261] certificate signing request csr-qsgn6 is approved, waiting to be issued Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.631891 4752 csr.go:257] certificate signing request csr-qsgn6 is issued Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.636532 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:54 crc 
kubenswrapper[4752]: I1124 11:06:54.640945 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.652196 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.654330 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.671155 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.693361 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697549 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697589 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697618 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697706 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697751 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697804 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697961 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.697995 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698021 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698070 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698044 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.698160 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:06:55.1981336 +0000 UTC m=+21.182953879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698051 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698198 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698229 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698237 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698248 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698285 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698264 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698299 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698301 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698420 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698448 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698496 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698522 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698547 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698551 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698562 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698578 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698617 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698664 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698685 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698725 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698751 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698784 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698807 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698887 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698921 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698946 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698968 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698990 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699018 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699020 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698761 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699070 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699094 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699179 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699236 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699257 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699304 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699330 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699382 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699407 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699426 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699448 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699468 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699485 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699505 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699542 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699563 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699619 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699643 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699664 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699687 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699708 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699728 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699753 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699788 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc 
kubenswrapper[4752]: I1124 11:06:54.699809 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699829 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699850 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699906 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699928 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699949 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699967 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700003 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700028 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700046 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700063 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700085 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700108 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700127 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700147 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700191 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700213 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700241 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700266 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700290 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700343 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700366 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700390 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700413 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700484 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700504 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700527 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700547 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700584 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700602 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700622 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700640 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700663 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700683 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700700 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700717 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700736 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700757 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700793 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700812 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700830 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700846 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700865 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700883 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700900 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700917 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700934 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700954 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700970 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700989 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701006 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701025 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701060 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701076 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701092 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701109 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701126 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701143 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701159 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701174 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701222 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701239 
4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701257 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701274 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701292 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701310 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701327 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701345 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701361 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701379 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701398 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701416 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701432 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701448 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701463 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701483 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701518 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701536 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701577 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701595 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701614 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701632 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701650 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701667 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701692 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701709 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701728 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.701747 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702007 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702028 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702045 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702062 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702120 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702137 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702154 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702171 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702205 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702223 4752 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698807 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698878 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698883 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.698989 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699293 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699308 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699352 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699369 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699379 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699383 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699585 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699591 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699729 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699786 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699810 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699815 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699835 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.699907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700033 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700101 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700146 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700197 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700275 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700306 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700363 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700465 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700534 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700608 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.700834 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702234 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702486 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702697 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702742 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703194 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703206 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703238 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703314 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703422 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703448 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703511 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.703599 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.702242 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704048 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704082 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704105 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704126 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704150 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704190 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704210 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704232 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704250 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704268 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704289 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704308 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704327 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704344 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704361 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704408 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704472 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704489 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704508 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704553 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704572 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704613 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704707 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704752 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704779 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704791 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704803 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704813 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704822 4752 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704832 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704840 4752 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704850 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704859 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704862 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704868 4752 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704900 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704910 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.704995 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705010 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705028 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705047 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705065 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705077 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705088 4752 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705102 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705114 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705127 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705142 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705154 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705166 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705177 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705188 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705209 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705391 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705536 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.705794 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.706983 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707014 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707026 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707037 4752 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707047 4752 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707058 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707069 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707080 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707091 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707102 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707114 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707126 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707137 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath 
\"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707151 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707161 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707172 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707183 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707193 4752 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707205 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707214 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707530 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707572 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707584 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707598 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707609 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707620 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc 
kubenswrapper[4752]: I1124 11:06:54.707641 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707652 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707663 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707099 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.707149 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.708023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.708640 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.708896 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709045 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709281 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709559 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709854 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709856 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.709953 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710020 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710296 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710424 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.710633 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711035 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711095 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711817 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711867 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712145 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712309 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712426 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712440 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712568 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.712578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.711250 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.713579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.713985 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714112 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714407 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714464 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714638 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714756 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.714674 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.715749 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.715928 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.716018 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.716339 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.716346 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.716732 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.716902 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717032 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717148 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717368 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717686 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.717817 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718412 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.722191 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718518 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718618 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.718910 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719139 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719150 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719474 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719804 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719834 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.719880 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719901 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.719943 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720184 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720268 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720531 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720492 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720635 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720675 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720981 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720991 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721294 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721312 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721325 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721585 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721789 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.721885 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.722045 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.722122 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.722383 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:55.222364564 +0000 UTC m=+21.207184863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.720161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.722395 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.722846 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.723089 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.723174 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:55.223151149 +0000 UTC m=+21.207971648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.723367 4752 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.725748 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.731850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.731848 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.732172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.734226 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.735890 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.735823 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.738590 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.738940 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.739702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.740248 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.740273 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.740290 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.740357 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:55.240333901 +0000 UTC m=+21.225154390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.742795 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.742858 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.743124 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.743122 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.743612 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.743850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.743940 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.744036 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.744147 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.744734 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.745971 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.746197 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.746880 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.746963 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.747523 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.748219 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.748405 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.748538 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.748851 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.749496 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.755498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.756898 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.750845 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.758479 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.758650 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.759528 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.759558 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.759573 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.759627 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:55.25961051 +0000 UTC m=+21.244430799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.759795 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.760125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.760588 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.760988 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.761152 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.761119 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.764129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.764410 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.764822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.765518 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.766145 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.767673 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.767924 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.768069 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.768073 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.768406 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.768773 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.769071 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.769300 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.770487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.772276 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.773885 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.773294 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.773337 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.773440 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.773486 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.771397 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.774559 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.775085 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.775969 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.776249 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.776402 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.776586 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.776835 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.776950 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.777271 4752 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.777340 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.777483 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.779002 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.780145 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.781317 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.782019 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.782571 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.788005 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.788836 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.790329 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.791351 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.793199 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.793762 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.794717 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.795925 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.796668 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.797292 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.798538 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.799134 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.800232 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.800857 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.802296 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.802892 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 24 
11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.804005 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.804123 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.804535 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.805141 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.806314 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.806902 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.808159 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.808418 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.808823 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.808851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809035 4752 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809047 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809047 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809055 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809095 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809106 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809117 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809127 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809137 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809146 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809156 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809165 4752 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 
11:06:54.809173 4752 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809182 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809190 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809198 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809207 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809215 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809224 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809232 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809242 4752 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809250 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809259 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809267 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809275 4752 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809283 4752 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809293 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809301 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809310 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809320 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809329 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809339 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809349 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809357 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809366 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809375 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809383 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809401 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809410 4752 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809420 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809430 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809439 4752 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809447 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809456 4752 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809465 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809474 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809483 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809492 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809503 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809512 4752 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809521 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 
11:06:54.809529 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809538 4752 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809546 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809554 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809564 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809574 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809582 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809590 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809598 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809607 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809616 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809626 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809635 4752 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc 
kubenswrapper[4752]: I1124 11:06:54.809646 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809656 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809663 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809671 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809681 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809689 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809696 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809703 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809712 4752 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809721 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809730 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809738 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809749 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 24 
11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809757 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809824 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809833 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809840 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809849 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809857 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809865 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809873 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809882 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809890 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809898 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809907 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809915 4752 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" 
Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809923 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809931 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809952 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809961 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809969 4752 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809977 4752 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.809988 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810010 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810018 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810035 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810044 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810051 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" 
DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810059 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810067 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810075 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810083 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810091 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810099 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810107 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810115 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810124 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810131 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810139 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810147 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810155 4752 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc 
kubenswrapper[4752]: I1124 11:06:54.810163 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810171 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810180 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810189 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810197 4752 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810204 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810211 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810220 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810228 4752 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810237 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810245 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810253 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810261 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810270 4752 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810278 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810287 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810295 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810304 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810325 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810334 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810343 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810351 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.810360 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.813644 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.814152 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.816940 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.822980 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.833859 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.834288 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: E1124 11:06:54.851557 4752 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.853598 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.865523 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.877491 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.881433 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.889845 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-o
perator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.891249 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.896920 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.905319 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.911625 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.915463 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.927895 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.938256 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: W1124 11:06:54.941038 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-73022eb06c2fb3bca078b8e69effb520fc189114da728e62a47e6338557a60b4 WatchSource:0}: Error finding container 73022eb06c2fb3bca078b8e69effb520fc189114da728e62a47e6338557a60b4: Status 404 returned error can't find the container with id 73022eb06c2fb3bca078b8e69effb520fc189114da728e62a47e6338557a60b4 Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.949610 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.950317 4752 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.961673 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.974816 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.984493 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:54 crc kubenswrapper[4752]: I1124 11:06:54.996702 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.008116 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.026482 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.044441 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.104223 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.129782 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.131334 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.134833 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.148119 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.160760 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.177985 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.192473 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.204334 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.213467 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.213724 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:06:56.213699091 +0000 UTC m=+22.198519380 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.213823 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.229514 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.238705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.250803 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.263936 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.275530 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.296429 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.307690 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.314214 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.314306 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.314348 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.314382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314543 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 
11:06:55.314578 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314597 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314664 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:56.314643337 +0000 UTC m=+22.299463666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314730 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314737 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314800 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314810 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:56.314794972 +0000 UTC m=+22.299615291 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314825 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314840 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314848 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:56.314837843 +0000 UTC m=+22.299658172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.314894 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:56.314882104 +0000 UTC m=+22.299702433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.317362 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.327890 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.337839 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.637248 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-24 11:01:54 +0000 UTC, rotation deadline is 2026-08-15 02:03:34.290491741 +0000 UTC Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.637312 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6326h56m38.653181781s for next certificate rotation Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.726845 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.726893 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.726946 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.726976 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.727028 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:06:55 crc kubenswrapper[4752]: E1124 11:06:55.727123 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.848938 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.848979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.848988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"73022eb06c2fb3bca078b8e69effb520fc189114da728e62a47e6338557a60b4"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.850045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8291c9a69abee1350f4f72a59d4b997abdf572549d2b651aea2a2848b6730eab"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.851430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.851453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0efae1ca6d369ec74e12148be56160b4d7dae96ec087acd65f2d744d6b80141b"} Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.867794 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.877148 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.884962 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.895210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.903825 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.914402 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f
500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.916946 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vhwb4"] Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.917307 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-brns2"] Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.917429 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.917866 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922325 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922521 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922559 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922660 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922748 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922800 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922902 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.922922 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.930132 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fd
ded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.984897 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:55Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:55 crc kubenswrapper[4752]: I1124 11:06:55.997994 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:55Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021052 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f890fc2e-8d6c-4109-882a-9e90340097a2-rootfs\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021120 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f890fc2e-8d6c-4109-882a-9e90340097a2-proxy-tls\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021158 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkhn\" (UniqueName: \"kubernetes.io/projected/04afe719-08a4-4d22-83d2-6dd16d191cd6-kube-api-access-lzkhn\") pod \"node-resolver-brns2\" (UID: \"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021200 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrgx\" (UniqueName: \"kubernetes.io/projected/f890fc2e-8d6c-4109-882a-9e90340097a2-kube-api-access-hsrgx\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021231 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04afe719-08a4-4d22-83d2-6dd16d191cd6-hosts-file\") pod \"node-resolver-brns2\" (UID: 
\"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021262 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f890fc2e-8d6c-4109-882a-9e90340097a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.021895 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.039418 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.060426 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.078059 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.093133 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.103641 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.113923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122096 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f890fc2e-8d6c-4109-882a-9e90340097a2-rootfs\") pod \"machine-config-daemon-vhwb4\" (UID: 
\"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f890fc2e-8d6c-4109-882a-9e90340097a2-proxy-tls\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122472 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkhn\" (UniqueName: \"kubernetes.io/projected/04afe719-08a4-4d22-83d2-6dd16d191cd6-kube-api-access-lzkhn\") pod \"node-resolver-brns2\" (UID: \"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrgx\" (UniqueName: \"kubernetes.io/projected/f890fc2e-8d6c-4109-882a-9e90340097a2-kube-api-access-hsrgx\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122677 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04afe719-08a4-4d22-83d2-6dd16d191cd6-hosts-file\") pod \"node-resolver-brns2\" (UID: \"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f890fc2e-8d6c-4109-882a-9e90340097a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f890fc2e-8d6c-4109-882a-9e90340097a2-rootfs\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.122876 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04afe719-08a4-4d22-83d2-6dd16d191cd6-hosts-file\") pod \"node-resolver-brns2\" (UID: \"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.123461 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f890fc2e-8d6c-4109-882a-9e90340097a2-mcd-auth-proxy-config\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.129456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f890fc2e-8d6c-4109-882a-9e90340097a2-proxy-tls\") pod \"machine-config-daemon-vhwb4\" 
(UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.138903 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.150494 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrgx\" (UniqueName: \"kubernetes.io/projected/f890fc2e-8d6c-4109-882a-9e90340097a2-kube-api-access-hsrgx\") pod \"machine-config-daemon-vhwb4\" (UID: \"f890fc2e-8d6c-4109-882a-9e90340097a2\") " pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.150642 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkhn\" (UniqueName: \"kubernetes.io/projected/04afe719-08a4-4d22-83d2-6dd16d191cd6-kube-api-access-lzkhn\") pod \"node-resolver-brns2\" (UID: \"04afe719-08a4-4d22-83d2-6dd16d191cd6\") " pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 
11:06:56.160011 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.188353 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.206025 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.223473 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.223646 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:06:58.223614523 +0000 UTC m=+24.208434942 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.235462 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.242546 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-brns2" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.324032 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.324087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.324105 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.324123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324177 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324204 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324211 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324217 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324188 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324252 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:58.324239069 +0000 UTC m=+24.309059358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324254 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324263 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324263 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:58.324257939 +0000 UTC m=+24.309078228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324296 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324295 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:58.32428343 +0000 UTC m=+24.309103719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:56 crc kubenswrapper[4752]: E1124 11:06:56.324314 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:06:58.324308761 +0000 UTC m=+24.309129050 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.343817 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2b5jq"] Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.344561 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.346612 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.346856 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.347058 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.347186 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.347334 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.348640 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jh899"] Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.348931 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.351888 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 24 11:06:56 crc kubenswrapper[4752]: W1124 11:06:56.357508 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04afe719_08a4_4d22_83d2_6dd16d191cd6.slice/crio-ee9665f659a25042fc50df004dbee8272b865908c2eba54eb0556710b3d6774e WatchSource:0}: Error finding container ee9665f659a25042fc50df004dbee8272b865908c2eba54eb0556710b3d6774e: Status 404 returned error can't find the container with id ee9665f659a25042fc50df004dbee8272b865908c2eba54eb0556710b3d6774e Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.358376 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.379228 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCod
e\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.398800 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.418212 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425245 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-netns\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425290 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-binary-copy\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-os-release\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425409 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-bin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425427 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-kubelet\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-hostroot\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425509 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-socket-dir-parent\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gvg\" (UniqueName: \"kubernetes.io/projected/136bf646-d691-4d52-b178-1f94d2d19458-kube-api-access-b8gvg\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425591 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-etc-kubernetes\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-cnibin\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-conf-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425683 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425728 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425765 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-multus\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425807 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-system-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-multus-certs\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425880 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-os-release\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425897 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-cnibin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rcps\" (UniqueName: \"kubernetes.io/projected/f578963d-5ff1-4e31-945b-cc59f0b244bf-kube-api-access-2rcps\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.425980 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-system-cni-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.426046 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-cni-binary-copy\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.426068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-k8s-cni-cncf-io\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.426138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-daemon-config\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: 
I1124 11:06:56.433787 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.447170 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.460980 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.473520 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.498253 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.511975 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526486 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-bin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-binary-copy\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526547 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-os-release\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526565 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-socket-dir-parent\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-kubelet\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-hostroot\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-cnibin\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526628 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gvg\" (UniqueName: \"kubernetes.io/projected/136bf646-d691-4d52-b178-1f94d2d19458-kube-api-access-b8gvg\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526645 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-etc-kubernetes\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526673 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-multus\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526689 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-conf-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526733 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526805 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-system-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526825 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-os-release\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-multus-certs\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-cnibin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526871 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-bin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-daemon-config\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526958 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rcps\" (UniqueName: \"kubernetes.io/projected/f578963d-5ff1-4e31-945b-cc59f0b244bf-kube-api-access-2rcps\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.526982 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-system-cni-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527005 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-cni-binary-copy\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527030 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-k8s-cni-cncf-io\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527059 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-netns\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527100 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-system-cni-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-netns\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " 
pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527304 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-k8s-cni-cncf-io\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-cni-multus\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-hostroot\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-socket-dir-parent\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527550 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-var-lib-kubelet\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527573 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-daemon-config\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-os-release\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527681 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527759 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f578963d-5ff1-4e31-945b-cc59f0b244bf-cni-binary-copy\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527696 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-cnibin\") pod 
\"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527735 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-system-cni-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527808 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-os-release\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527872 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-multus-conf-dir\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527889 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-cnibin\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527901 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-host-run-multus-certs\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.527922 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f578963d-5ff1-4e31-945b-cc59f0b244bf-etc-kubernetes\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.528202 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/136bf646-d691-4d52-b178-1f94d2d19458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.528483 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-binary-copy\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.530515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/136bf646-d691-4d52-b178-1f94d2d19458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 
11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.531069 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.542139 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rcps\" (UniqueName: \"kubernetes.io/projected/f578963d-5ff1-4e31-945b-cc59f0b244bf-kube-api-access-2rcps\") pod \"multus-jh899\" (UID: \"f578963d-5ff1-4e31-945b-cc59f0b244bf\") " pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.542904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gvg\" (UniqueName: \"kubernetes.io/projected/136bf646-d691-4d52-b178-1f94d2d19458-kube-api-access-b8gvg\") pod \"multus-additional-cni-plugins-2b5jq\" (UID: \"136bf646-d691-4d52-b178-1f94d2d19458\") " pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.546598 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.558529 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.573142 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.583796 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.595780 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.605466 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.629434 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.661086 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.667968 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jh899" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.670603 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: W1124 11:06:56.690102 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf578963d_5ff1_4e31_945b_cc59f0b244bf.slice/crio-cc9ac3c698385ed8023022f7bc318b9fd3790bba3885592dc17130670138755e WatchSource:0}: Error finding container cc9ac3c698385ed8023022f7bc318b9fd3790bba3885592dc17130670138755e: Status 404 returned error can't find the container with id cc9ac3c698385ed8023022f7bc318b9fd3790bba3885592dc17130670138755e Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.702771 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.738330 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.739596 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.741336 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.742333 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.743232 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.744699 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.745426 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.746059 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z"
Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.746892 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.747555 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.748874 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.749434 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.750875 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.751506 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bkksr"] Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.752525 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.772274 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.793593 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.812512 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829738 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829802 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829829 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829875 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwnt\" (UniqueName: \"kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829902 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.829974 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830053 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830078 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830103 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830177 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc 
kubenswrapper[4752]: I1124 11:06:56.830214 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830316 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830372 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830405 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.830472 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.833800 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.853294 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.856027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 
11:06:56.856492 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.856504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"84cef0080dc4f009a7032860644d4f51c99c50762d8f60cea3354c963094da79"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.858207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-brns2" event={"ID":"04afe719-08a4-4d22-83d2-6dd16d191cd6","Type":"ContainerStarted","Data":"a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.858229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-brns2" event={"ID":"04afe719-08a4-4d22-83d2-6dd16d191cd6","Type":"ContainerStarted","Data":"ee9665f659a25042fc50df004dbee8272b865908c2eba54eb0556710b3d6774e"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.861014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerStarted","Data":"8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.861040 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerStarted","Data":"cc9ac3c698385ed8023022f7bc318b9fd3790bba3885592dc17130670138755e"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.863288 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerStarted","Data":"3ad3a2a0a499bf9b73ddf1eaf7cb1e71a6e6d14079a0b61d28ccca920116a0d0"} Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.873402 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.906401 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.913651 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931135 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931225 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931240 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931260 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931323 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931376 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931402 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931437 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931470 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931512 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931531 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931550 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931568 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwnt\" (UniqueName: \"kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931639 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931656 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.931675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932514 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932550 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932582 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket\") pod 
\"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932630 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.932852 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933566 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933643 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933693 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933826 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933861 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 
11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933887 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.933963 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.934004 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.934483 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.935694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:56 crc kubenswrapper[4752]: I1124 11:06:56.985496 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:56Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.022337 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.044627 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwnt\" (UniqueName: \"kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt\") pod \"ovnkube-node-bkksr\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.061893 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.099578 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.103849 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:06:57 crc kubenswrapper[4752]: W1124 11:06:57.118560 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa360dfd_2d4c_4442_84c9_af5d97c4c1fa.slice/crio-c35a8ef86d4ab8140f3da0310e5ef06887a7ea52a57a2f5d5b14e725dffcd022 WatchSource:0}: Error finding container c35a8ef86d4ab8140f3da0310e5ef06887a7ea52a57a2f5d5b14e725dffcd022: Status 404 returned error can't find the container with id c35a8ef86d4ab8140f3da0310e5ef06887a7ea52a57a2f5d5b14e725dffcd022 Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.145340 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.184337 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.219434 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.273152 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.301246 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.345146 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.389794 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.423018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.465340 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.507310 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.541215 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.578594 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.644050 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.666422 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.727928 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.727985 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:57 crc kubenswrapper[4752]: E1124 11:06:57.728063 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:06:57 crc kubenswrapper[4752]: E1124 11:06:57.728210 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.727945 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:57 crc kubenswrapper[4752]: E1124 11:06:57.728473 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.868464 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f"} Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.870134 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c" exitCode=0 Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.870194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c"} Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.871682 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" exitCode=0 Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.871715 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.871735 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"c35a8ef86d4ab8140f3da0310e5ef06887a7ea52a57a2f5d5b14e725dffcd022"} Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.883935 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.911824 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.925642 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.937072 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.950564 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.965523 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.980903 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:57 crc kubenswrapper[4752]: I1124 11:06:57.995775 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:57Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.028514 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.061197 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.101941 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.162578 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.186423 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.221585 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.243904 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.244080 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:07:02.244065419 +0000 UTC m=+28.228885708 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.261649 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.300535 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.342852 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.344612 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.344653 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.344672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.344694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344797 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344822 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344842 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:58 crc 
kubenswrapper[4752]: E1124 11:06:58.344844 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344855 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344863 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:02.344843629 +0000 UTC m=+28.329663928 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344870 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344882 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344895 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:02.344881691 +0000 UTC m=+28.329701980 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344910 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:02.344901811 +0000 UTC m=+28.329722100 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344938 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: E1124 11:06:58.344960 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:02.344953723 +0000 UTC m=+28.329774012 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.381465 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.426670 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ff
ad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.462321 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.503007 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.545147 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc 
kubenswrapper[4752]: I1124 11:06:58.585004 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.621870 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\
\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.665413 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z 
is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.705864 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.740565 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.781305 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.877500 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8" exitCode=0 Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.877592 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" 
event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881533 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881563 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.881577 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.895569 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.908658 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.914031 
4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jktjk"] Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.914442 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.917460 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.920164 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:58Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.933712 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.950529 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkv8\" (UniqueName: 
\"kubernetes.io/projected/270867eb-4cb0-47a6-958a-f411367a85b7-kube-api-access-pgkv8\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.950844 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/270867eb-4cb0-47a6-958a-f411367a85b7-host\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.950943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270867eb-4cb0-47a6-958a-f411367a85b7-serviceca\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.954133 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 24 11:06:58 crc kubenswrapper[4752]: I1124 11:06:58.972345 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.022251 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.051801 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkv8\" (UniqueName: \"kubernetes.io/projected/270867eb-4cb0-47a6-958a-f411367a85b7-kube-api-access-pgkv8\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.051899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/270867eb-4cb0-47a6-958a-f411367a85b7-host\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.051931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270867eb-4cb0-47a6-958a-f411367a85b7-serviceca\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.052033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/270867eb-4cb0-47a6-958a-f411367a85b7-host\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.053018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270867eb-4cb0-47a6-958a-f411367a85b7-serviceca\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.061988 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.095176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkv8\" (UniqueName: \"kubernetes.io/projected/270867eb-4cb0-47a6-958a-f411367a85b7-kube-api-access-pgkv8\") pod \"node-ca-jktjk\" (UID: \"270867eb-4cb0-47a6-958a-f411367a85b7\") " pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.123699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.177374 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.217936 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.230275 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jktjk" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.251572 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.280311 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.322083 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.370230 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.402559 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.448268 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.482135 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.524037 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.562628 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.604438 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.652116 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.682599 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.721565 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.727272 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.727289 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.727373 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:06:59 crc kubenswrapper[4752]: E1124 11:06:59.727478 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:06:59 crc kubenswrapper[4752]: E1124 11:06:59.727662 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:06:59 crc kubenswrapper[4752]: E1124 11:06:59.727888 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.773077 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f
00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.802393 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.841534 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.886536 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927" exitCode=0 Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.886639 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927"} Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.887286 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.888292 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jktjk" event={"ID":"270867eb-4cb0-47a6-958a-f411367a85b7","Type":"ContainerStarted","Data":"ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0"} Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.888324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jktjk" event={"ID":"270867eb-4cb0-47a6-958a-f411367a85b7","Type":"ContainerStarted","Data":"974e9f8b522521010ed910e007face3d8aa372c7737105d309e5bb25f240bb62"} Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.923363 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:06:59 crc kubenswrapper[4752]: I1124 11:06:59.963551 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:06:59Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.002718 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.045638 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.099105 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.123006 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.162525 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.203382 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.241936 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.283906 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.324487 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.363400 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.402558 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.445995 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.484369 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.528529 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.566491 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.608685 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.646862 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.896491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.899586 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae" exitCode=0 Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.899652 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae"} Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.923851 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.940816 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.954486 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.969707 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.983863 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:00Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.994611 4752 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.997495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.997544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.997556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:00 crc kubenswrapper[4752]: I1124 11:07:00.997703 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.005210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.005885 4752 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.006202 4752 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.008140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.008184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.008197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 
11:07:01.008214 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.008227 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.017912 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.025551 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.029848 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.029885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.029896 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.029913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.029926 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.040159 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.043699 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.050669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.050695 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.050704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.050722 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.050744 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.058292 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.063713 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.067084 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.067111 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.067121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.067133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.067142 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.079500 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.082205 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.083365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.083407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.083420 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.083438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.083449 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.094873 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.095155 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.125656 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.127237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.127285 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.127302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.127327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.127348 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.165961 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5
b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.198290 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.229697 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.229724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.229732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.229758 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.229767 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.241315 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.285346 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.332174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.332222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.332232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.332249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.332261 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.434739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.434802 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.434813 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.434830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.434840 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.538780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.538848 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.538870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.538894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.538912 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.643185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.643238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.643256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.643280 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.643298 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.728208 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.728412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.730657 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.730873 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.730990 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:01 crc kubenswrapper[4752]: E1124 11:07:01.731169 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.746775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.746854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.746876 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.747202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.747418 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.851108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.851314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.851406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.851482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.851553 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.908013 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41" exitCode=0 Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.908070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.938790 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.955188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.955238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.955252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.955273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.955291 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:01Z","lastTransitionTime":"2025-11-24T11:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.970887 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:01 crc kubenswrapper[4752]: I1124 11:07:01.987454 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:01Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.003736 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.018088 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: 
I1124 11:07:02.032394 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058783 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.058713 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8g
vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.076900 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.089277 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.108305 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z 
is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.119065 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.129621 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.140607 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.149886 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.159050 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.161353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.161379 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.161387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.161401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.161409 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.264636 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.264931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.264998 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.265062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.265135 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.286083 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.286315 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.286290942 +0000 UTC m=+36.271111241 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.368908 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.368952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.368964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.368983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.368997 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.387514 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.387655 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.387761 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.387728553 +0000 UTC m=+36.372548852 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.387919 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388076 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388166 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.388146356 +0000 UTC m=+36.372966685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388257 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388274 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388288 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388330 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.388318791 +0000 UTC m=+36.373139091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.388099 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.388730 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.388989 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.389057 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.389081 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:02 crc kubenswrapper[4752]: E1124 11:07:02.389180 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.389152898 +0000 UTC m=+36.373973227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472905 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580364 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
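
The "not registered" errors above come from the kubelet's volume manager: after a kubelet restart, ConfigMap and Secret sources must be re-registered in its per-pod object cache before a mount is allowed, so these messages do not by themselves prove the objects are gone from the API. A minimal sketch to tell the two cases apart, assuming cluster-admin credentials and the kubernetes Python client; the namespaces and names are copied from the log lines above:

    # Check whether the objects the kubelet calls "not registered" exist in the
    # API server. 404s point at genuinely missing objects; successful reads
    # point at the kubelet's own cache still catching up after the restart.
    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    OBJECTS = [  # (namespace, kind, name) copied from the errors above
        ("openshift-network-console", "configmap", "networking-console-plugin"),
        ("openshift-network-console", "secret", "networking-console-plugin-cert"),
        ("openshift-network-diagnostics", "configmap", "kube-root-ca.crt"),
        ("openshift-network-diagnostics", "configmap", "openshift-service-ca.crt"),
    ]

    config.load_kube_config()  # or load_incluster_config() from a debug pod
    v1 = client.CoreV1Api()
    for ns, kind, name in OBJECTS:
        try:
            (v1.read_namespaced_secret if kind == "secret"
             else v1.read_namespaced_config_map)(name, ns)
            print(f"{kind} {ns}/{name}: present")
        except ApiException as exc:
            print(f"{kind} {ns}/{name}: HTTP {exc.status}")

If everything reports present, the 8s retries (durationBeforeRetry above) should clear on their own once the kubelet finishes syncing.
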
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.472905 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.580364 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.684467 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.684523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.684542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.684567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.684584 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.788812 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.788890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.788909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.788983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.789003 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.891954 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.892389 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.892407 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.892435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.892454 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
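
Each NodeNotReady burst above carries the same runtime message: the CNI config directory the kubelet watches is still empty, because no network plugin has written its configuration yet. A minimal check to run on the node itself, assuming shell access; the directory path is quoted straight from the message (on OpenShift it is typically populated by multus, e.g. with a 00-multus.conf, once the network pods come up):

    # List whatever CNI configuration the kubelet can currently see.
    import json, pathlib

    CNI_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")  # path from the log

    confs = ([p for p in sorted(CNI_DIR.iterdir())
              if p.suffix in (".conf", ".conflist", ".json")]
             if CNI_DIR.is_dir() else [])
    if not confs:
        print(f"{CNI_DIR} is empty -- matches the NetworkPluginNotReady message")
    for p in confs:
        conf = json.loads(p.read_text())
        plugins = conf.get("plugins", [conf])  # a .conflist wraps a plugins list
        print(p.name, conf.get("name"), [pl.get("type") for pl in plugins])
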
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.918002 4752 generic.go:334] "Generic (PLEG): container finished" podID="136bf646-d691-4d52-b178-1f94d2d19458" containerID="c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652" exitCode=0
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.918171 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerDied","Data":"c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652"}
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.942299 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z"
Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.973022 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.995405 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:02Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.997564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.997831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.998675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.998740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:02 crc kubenswrapper[4752]: I1124 11:07:02.998871 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:02Z","lastTransitionTime":"2025-11-24T11:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.007167 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z"
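
From here on, every "Failed to update status for pod" entry fails identically: the API server's call to the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743 is rejected because the webhook's serving certificate expired on 2025-08-24, three months before the node clock's 2025-11-24 (a common state when a CRC VM is resumed long after its certificates were minted, before certificate rotation has caught up). A minimal sketch to confirm what that endpoint is serving, assuming it runs on the node itself and that the third-party cryptography package is installed:

    # Fetch the webhook's serving certificate and compare its validity window
    # with the current clock. ssl.get_server_certificate() does not verify the
    # peer, so it still works even though the certificate is expired.
    import datetime, ssl
    from cryptography import x509

    pem = ssl.get_server_certificate(("127.0.0.1", 9743))  # endpoint from the log
    cert = x509.load_pem_x509_certificate(pem.encode())
    now = datetime.datetime.now(datetime.timezone.utc)
    print("subject:   ", cert.subject.rfc4514_string())
    print("not before:", cert.not_valid_before_utc)  # cryptography >= 42
    print("not after: ", cert.not_valid_after_utc)
    print("expired:   ", now > cert.not_valid_after_utc)
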
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.034306 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.048264 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.068234 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.080904 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.080904 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z"
Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.092606 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.101472 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.101512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.101523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.101559 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.101569 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.105329 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.128051 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.139218 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.154204 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.171248 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.203512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.203544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.203556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.203570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.203579 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.306022 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.306079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.306091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.306109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.306120 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.409011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.409049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.409057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.409071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.409080 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.510915 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.510966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.510982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.511005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.511021 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
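
The five-line NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady block repeats roughly every 100 ms: the node stays NotReady because no CNI network config exists yet under /etc/kubernetes/cni/net.d/. A minimal sketch of the check the kubelet message implies, assuming it is run on the affected node:

```python
# Minimal sketch, assuming it runs on the affected node: look for the CNI
# network config whose absence the kubelet keeps reporting.
import glob

confs = sorted(glob.glob("/etc/kubernetes/cni/net.d/*"))
if confs:
    print("CNI config present:", confs)
else:
    print("no CNI configuration file - matches NetworkPluginNotReady above")
```

The directory stays empty until the network plugin writes its config; consistent with that, the condition clears later in this capture only after the ovnkube-node pod's readiness probes start reporting ready.
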
Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.613975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.614045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.614066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.614093 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.614115 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.716514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.716542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.716550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.716562 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.716573 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.726986 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.727033 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:03 crc kubenswrapper[4752]: E1124 11:07:03.727152 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.727166 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:03 crc kubenswrapper[4752]: E1124 11:07:03.727346 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:03 crc kubenswrapper[4752]: E1124 11:07:03.727465 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.818518 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.818556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.818567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.818579 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.818588 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.921825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.921897 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.921913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.921938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.921956 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:03Z","lastTransitionTime":"2025-11-24T11:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
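
Each rejected status patch above is a JSON patch that appears quoted twice, once when the error string was formatted and once more in the log line itself, which is what produces the runs of \\\" escapes. A sketch that recovers the first patch as plain JSON, assuming this excerpt has been saved verbatim to a file named kubelet.log (hypothetical path), and that the only transformations to undo are the capture's hard line wraps plus two levels of quoting:

```python
# Minimal sketch, assuming this excerpt is saved verbatim as "kubelet.log"
# (hypothetical path). The patch is quoted twice - once by the error string,
# once by the log line - so it is unescaped twice before parsing.
import codecs
import json
import re

text = open("kubelet.log").read()
m = re.search(r'failed to patch status \\"(.*?)\\" for pod', text, re.DOTALL)
raw = m.group(1).replace("\n", "")   # undo the capture's hard line wraps
for _ in range(2):                   # peel both levels of quoting
    raw = codecs.decode(raw, "unicode_escape")
patch = json.loads(raw)
print(json.dumps(patch, indent=2)[:300])
```

Decoded this way, each patch is an ordinary pod status update (conditions, containerStatuses, podIP and so on); nothing in the payloads is malformed, they are simply all rejected by the expired webhook.
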
Has your network provider started?"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.930295 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" event={"ID":"136bf646-d691-4d52-b178-1f94d2d19458","Type":"ContainerStarted","Data":"0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.940679 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707"} Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.941003 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.941079 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.969298 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:03 crc kubenswrapper[4752]: I1124 11:07:03.987224 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.005090 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.023485 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.024278 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.024307 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.024318 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.024335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.024347 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.050472 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.065386 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.077153 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.091847 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.103607 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.122806 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z 
is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.126465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.126508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.126521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.126572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.126582 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.136438 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: 
I1124 11:07:04.150500 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.165242 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.177923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.192564 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.202950 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.213104 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.224699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.228469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.228532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.228551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.228579 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.228612 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.236912 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.251274 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.262611 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.283590 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.295325 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.305332 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.319624 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331141 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.331428 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.341605 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.364771 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162c
ffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.376853 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.387638 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.399615 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.408950 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.434017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.434070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.434089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.434114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.434133 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.509735 4752 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.542839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.542902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.542917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.542938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.542952 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.645482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.645793 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.645899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.646001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.646095 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.749693 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.749798 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.749821 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.749841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.749854 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.752808 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.773763 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.790013 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.808711 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.838853 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.852307 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.852539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.852603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.852677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.852746 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.860182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.876246 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.890973 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.904652 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.916020 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.944546 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.949423 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad81
1c2d72e7dd403b5a74171707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.957006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.957062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.957094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.957116 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.957131 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:04Z","lastTransitionTime":"2025-11-24T11:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.962677 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.973560 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.983864 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:04 crc kubenswrapper[4752]: I1124 11:07:04.995210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.059902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.059965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.059982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.060007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.060026 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.162472 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.162527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.162543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.162565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.162584 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.265635 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.265678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.265687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.265704 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.265716 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.368440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.368475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.368484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.368497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.368505 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.470639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.470866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.470932 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.470999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.471066 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.576436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.576691 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.576792 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.576875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.576951 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.678534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.678811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.678889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.678967 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.679044 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.727121 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.727191 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:05 crc kubenswrapper[4752]: E1124 11:07:05.727304 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:05 crc kubenswrapper[4752]: E1124 11:07:05.727365 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.727516 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:05 crc kubenswrapper[4752]: E1124 11:07:05.727686 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.781117 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.781412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.781439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.781459 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.781473 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.884192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.884236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.884251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.884270 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.884285 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.966827 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.986667 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.986727 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.986778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.986808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:05 crc kubenswrapper[4752]: I1124 11:07:05.986823 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:05Z","lastTransitionTime":"2025-11-24T11:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.089931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.089992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.090009 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.090034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.090050 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.193014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.193069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.193085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.193106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.193122 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.295737 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.295976 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.296023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.296047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.296063 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.399169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.399233 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.399255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.399306 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.399327 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.503322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.503414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.503441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.503473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.503496 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.607271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.607345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.607369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.607402 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.607428 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.710064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.710106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.710117 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.710134 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.710145 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.812304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.812343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.812355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.812371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.812383 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.915652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.915696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.915707 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.915727 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.915771 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:06Z","lastTransitionTime":"2025-11-24T11:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.972188 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/0.log" Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.976454 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707" exitCode=1 Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.976525 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707"} Nov 24 11:07:06 crc kubenswrapper[4752]: I1124 11:07:06.978149 4752 scope.go:117] "RemoveContainer" containerID="9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.001182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:06Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.018268 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.020562 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.020590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.020598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.020614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.020627 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.046460 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad81
1c2d72e7dd403b5a74171707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.062389 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.076470 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.092569 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.104689 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.121452 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 
11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.123665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.123692 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.123702 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.123717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.123726 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.135696 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.136732 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.154291 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.167443 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.190583 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.205008 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.217160 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.225745 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.225778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.225786 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.225797 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.225806 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.229804 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.338370 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.338426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.338442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.338464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.338488 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.441004 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.441039 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.441050 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.441064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.441074 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.542542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.542566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.542573 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.542586 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.542594 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.645675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.645739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.645816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.645845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.645866 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.727273 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.727341 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:07 crc kubenswrapper[4752]: E1124 11:07:07.727404 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:07 crc kubenswrapper[4752]: E1124 11:07:07.727499 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.727598 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:07 crc kubenswrapper[4752]: E1124 11:07:07.727678 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.748334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.748376 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.748388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.748405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.748418 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.850561 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.850619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.850636 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.850659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.850678 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.911697 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.933273 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.950190 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.953148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.953197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.953208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.953224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.953551 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:07Z","lastTransitionTime":"2025-11-24T11:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.966822 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.981955 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:07Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.983056 4752 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/0.log" Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.987317 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568"} Nov 24 11:07:07 crc kubenswrapper[4752]: I1124 11:07:07.988298 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.008120 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5
b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.024923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.041936 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055646 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055727 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.055843 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.070030 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b
91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.085357 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.102888 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad81
1c2d72e7dd403b5a74171707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.112336 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.126689 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.143296 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.156823 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.157923 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.157959 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.157974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.157999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.158014 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.178042 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.191538 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.203129 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.217168 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.220536 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn"] Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.226535 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.230444 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.230483 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.236083 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.248325 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.261114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.261195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.261222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.261255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.261281 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.262210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.280772 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb14
2aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.296109 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\
\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.318629 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e820
4f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.332238 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.346559 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.348838 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.348873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fscc\" (UniqueName: \"kubernetes.io/projected/7d919d0f-1c4b-493d-8e80-7927e899e908-kube-api-access-6fscc\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.348938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.349057 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d919d0f-1c4b-493d-8e80-7927e899e908-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.359669 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.363601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.363627 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.363636 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.363650 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.363659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.371612 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.383899 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.394012 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.402798 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.413232 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478
274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.422923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.442481 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.450477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.450528 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d919d0f-1c4b-493d-8e80-7927e899e908-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.450573 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.450603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fscc\" (UniqueName: \"kubernetes.io/projected/7d919d0f-1c4b-493d-8e80-7927e899e908-kube-api-access-6fscc\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.451123 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.451350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d919d0f-1c4b-493d-8e80-7927e899e908-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.457480 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d919d0f-1c4b-493d-8e80-7927e899e908-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466273 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466803 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.466814 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.468703 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fscc\" (UniqueName: \"kubernetes.io/projected/7d919d0f-1c4b-493d-8e80-7927e899e908-kube-api-access-6fscc\") pod \"ovnkube-control-plane-749d76644c-z64nn\" (UID: \"7d919d0f-1c4b-493d-8e80-7927e899e908\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.479883 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.490096 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.505958 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e820
4f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.519172 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.530517 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.542167 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.544222 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.552862 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-c
ni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.567567 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.568853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.568888 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.568900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.568920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.568931 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.585894 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.599210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:08Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.672597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.672654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.672666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.672690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.672705 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.775631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.775690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.775700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.775723 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.775737 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.878344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.878400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.878413 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.878433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.878444 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.980851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.980893 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.980924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.980941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.980950 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:08Z","lastTransitionTime":"2025-11-24T11:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.991541 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" event={"ID":"7d919d0f-1c4b-493d-8e80-7927e899e908","Type":"ContainerStarted","Data":"cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.991577 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" event={"ID":"7d919d0f-1c4b-493d-8e80-7927e899e908","Type":"ContainerStarted","Data":"3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.991587 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" event={"ID":"7d919d0f-1c4b-493d-8e80-7927e899e908","Type":"ContainerStarted","Data":"66da66eb1cfc7b1ecdb28f21e0892877c0b42f7dabab97b9fe2e20eb461abac3"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.993322 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/1.log" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.993908 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/0.log" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.996240 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568" exitCode=1 Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.996272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568"} Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.996311 4752 scope.go:117] "RemoveContainer" containerID="9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707" Nov 24 11:07:08 crc kubenswrapper[4752]: I1124 11:07:08.996763 4752 scope.go:117] 
"RemoveContainer" containerID="46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568" Nov 24 11:07:08 crc kubenswrapper[4752]: E1124 11:07:08.996898 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.035132 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.054092 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.070274 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z"
Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.084191 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.084231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.084243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.084263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.084273 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.085334 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.097168 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b
91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.110795 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.128632 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e820
4f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.138705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.150669 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.162215 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.174795 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.184607 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.186140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.186178 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.186188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.186201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.186210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.197719 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.210248 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.220589 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.230342 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.250393 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.263305 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.276079 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.288701 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.288774 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.288789 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.288810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.288827 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.289870 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.302338 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b
91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.319971 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.346671 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e820
4f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 
11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://
fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.359677 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.372309 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.384839 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.392801 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.392839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.392851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.392878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.392890 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.396616 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.407412 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.421785 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.435931 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.449157 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.464715 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.496116 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.496171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.496187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.496207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.496222 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.598473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.598519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.598530 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.598547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.598560 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.696501 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8gb7x"] Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.697175 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.697255 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.700629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.700671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.700686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.700706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.700723 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.727200 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.727325 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.727378 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.727210 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.727585 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.727720 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.740254 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadf
dba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.757142 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.765501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.765604 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jscg9\" (UniqueName: \"kubernetes.io/projected/d6dea074-570b-440e-b555-46c1dde88efa-kube-api-access-jscg9\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.771303 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.791613 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.802836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.802882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc 
kubenswrapper[4752]: I1124 11:07:09.802908 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.802928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.802943 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.808714 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.830858 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.855220 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162c
ffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9775e846766b8287b76f9bcaf9040479ff92ad811c2d72e7dd403b5a74171707\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:06Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:140\\\\nI1124 11:07:06.075969 6051 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1124 11:07:06.075989 6051 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1124 11:07:06.076004 6051 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:06.076018 6051 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:06.076024 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:06.076035 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:06.076039 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:06.076049 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:06.076068 6051 factory.go:656] Stopping watch factory\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:06.076085 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:06.076094 6051 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1124 11:07:06.076100 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:06.076132 6051 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 
7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.867164 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jscg9\" (UniqueName: \"kubernetes.io/projected/d6dea074-570b-440e-b555-46c1dde88efa-kube-api-access-jscg9\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.867250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.867415 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:09 crc kubenswrapper[4752]: E1124 11:07:09.867637 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:10.367608307 +0000 UTC m=+36.352428636 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.868541 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.881204 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.893044 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jscg9\" (UniqueName: \"kubernetes.io/projected/d6dea074-570b-440e-b555-46c1dde88efa-kube-api-access-jscg9\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.894111 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.904705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.905399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.905443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.905470 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.905491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.905505 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:09Z","lastTransitionTime":"2025-11-24T11:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.915577 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.927055 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.946539 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.962709 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.978277 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:09 crc kubenswrapper[4752]: I1124 11:07:09.990508 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:09Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.001615 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/1.log" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.005837 4752 scope.go:117] "RemoveContainer" containerID="46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568" Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.006026 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.007337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.007416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.007435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.007453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.007464 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.021655 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.037693 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.049709 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.060523 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.091905 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.108868 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.112127 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.112163 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.112172 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.112185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.112193 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.124087 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.142666 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.164316 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.204189 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.214582 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.214642 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.214657 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.214694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.214715 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.246725 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.281194 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.317507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.317604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.317624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.317648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.317676 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.326044 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.365039 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc 
kubenswrapper[4752]: I1124 11:07:10.373714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.373985 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.374006 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:07:26.373972867 +0000 UTC m=+52.358793166 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.374200 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.374276 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:11.374253626 +0000 UTC m=+37.359073955 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.402288 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.419544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.419593 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.419609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.419632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.419649 4752 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.442520 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.474606 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.474803 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.474839 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.474869 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.474926 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.474881 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.474987 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475012 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.474965 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475081 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475125 4752 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475144 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475085 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:26.475058887 +0000 UTC m=+52.459879206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475249 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:26.475225433 +0000 UTC m=+52.460045722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475271 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:26.475262094 +0000 UTC m=+52.460082373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:10 crc kubenswrapper[4752]: E1124 11:07:10.475286 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:26.475278334 +0000 UTC m=+52.460098623 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.483078 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:10Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.522584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.522633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.522649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.522674 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.522694 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.625799 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.625843 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.625862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.625884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.625902 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.728591 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.728661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.728683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.728717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.728741 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.831902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.831946 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.831962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.831980 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.831994 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.935342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.935392 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.935404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.935422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:10 crc kubenswrapper[4752]: I1124 11:07:10.935435 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:10Z","lastTransitionTime":"2025-11-24T11:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.058293 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.058363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.058374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.058395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.058409 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.161230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.161317 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.161332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.161357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.161375 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.265108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.265178 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.265194 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.265223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.265240 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.368175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.368230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.368247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.368268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.368285 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.385079 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.385253 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.385318 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:13.385301185 +0000 UTC m=+39.370121484 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.390545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.390649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.390669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.390700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.390729 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.411943 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:11Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.417252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.417306 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.417324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.417351 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.417384 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.441680 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:11Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.446154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.446240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.446259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.446286 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.446304 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.463368 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:11Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.467090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.467263 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.467359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.467449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.467520 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.481486 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:11Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.485537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.485724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.485810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.485876 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.485932 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.500345 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:11Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.500457 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.501811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.501928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.502023 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.502126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.502210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.605384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.605678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.605795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.605887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.605960 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.708985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.709056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.709075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.709519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.709588 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.727193 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.727218 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.727241 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.727324 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.727453 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.727603 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.727718 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:11 crc kubenswrapper[4752]: E1124 11:07:11.727917 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.811734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.811814 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.811832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.811854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.811873 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.914015 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.914054 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.914065 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.914080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:11 crc kubenswrapper[4752]: I1124 11:07:11.914091 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:11Z","lastTransitionTime":"2025-11-24T11:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.016486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.016536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.016551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.016571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.016586 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.119354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.119406 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.119424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.119448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.119468 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.222733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.222824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.222846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.222877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.222900 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.326232 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.326308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.326333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.326360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.326382 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.429669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.429708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.429719 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.429735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.429769 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.533210 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.533259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.533279 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.533300 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.533314 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.635440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.635513 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.635534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.635566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.635590 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.738244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.738301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.738318 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.738342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.738359 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.841720 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.841841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.841861 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.841887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.841911 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.945785 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.945844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.945861 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.945884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:12 crc kubenswrapper[4752]: I1124 11:07:12.945902 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:12Z","lastTransitionTime":"2025-11-24T11:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.048671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.048734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.048784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.048853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.048874 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.152059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.152108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.152125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.152153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.152169 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.259830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.259886 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.259901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.259931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.259947 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.362589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.362647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.362658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.362676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.362688 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.409218 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.409382 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.409475 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:17.409456278 +0000 UTC m=+43.394276567 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.466385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.466444 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.466460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.466482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.466510 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.568786 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.568857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.568868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.568884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.568898 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.671458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.671526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.671564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.671582 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.671592 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.727272 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.727329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.727283 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.727420 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.727497 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.727607 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.727812 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
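Every "Network plugin returns error" message above reduces to one filesystem check: there is no CNI configuration file in /etc/kubernetes/cni/net.d/. A stdlib sketch of such a scan (the accepted extensions follow common CNI conventions for .conf/.conflist/.json files and are an assumption here):

```go
// Sketch: reproduce the "no CNI configuration file" check against the
// directory named in the log. The accepted extensions mirror common CNI
// conventions and are assumptions, not taken from the log.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d/"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// The state the kubelet keeps reporting above.
		fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
		return
	}
	fmt.Println("found CNI configs:", confs)
}
```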
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:13 crc kubenswrapper[4752]: E1124 11:07:13.727943 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.773601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.773638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.773649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.773665 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.773677 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.876726 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.876793 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.876810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.876829 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.876845 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.979835 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.979875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.979884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.979900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:13 crc kubenswrapper[4752]: I1124 11:07:13.979910 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:13Z","lastTransitionTime":"2025-11-24T11:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry block (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats at 11:07:14.082, 11:07:14.185, 11:07:14.288, 11:07:14.390, 11:07:14.493 and 11:07:14.596; only the heartbeat timestamps advance ...]
Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.699867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.699927 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.699944 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.699967 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.699984 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:14Z","lastTransitionTime":"2025-11-24T11:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.753776 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.770303 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.784826 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802382 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:14Z","lastTransitionTime":"2025-11-24T11:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
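Every "Failed to update status for pod" entry here fails for the same reason: the serving certificate of the pod.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, roughly three months before these entries. The check Go's TLS stack applies is a plain NotBefore/NotAfter comparison against the current time; a stdlib sketch over a PEM file (the path is a placeholder, not taken from the log):

```go
// Sketch: the validity check behind "x509: certificate has expired or is
// not yet valid". The PEM path below is a placeholder.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/path/to/serving-cert.pem") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		// Same shape as the log error: current time ... is after NotAfter.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid until %s\n", cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```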
Has your network provider started?"} Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.802413 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.821321 4752 status_manager.go:875] "Failed to update status for pod" 
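Each kubenswrapper payload carries a klog-style header, as in the entry just above: `I1124 11:07:14.821321 4752 status_manager.go:875] "Failed to update status for pod"`, i.e. severity letter plus MMDD, wall-clock time, PID, and file:line. For slicing these entries out of the journal wrapper, a small stdlib sketch (the pattern is fitted to the lines in this log, not a complete klog grammar):

```go
// Sketch: split a klog header into its fields. Fitted to the entries in
// this log; not a general-purpose klog parser.
package main

import (
	"fmt"
	"regexp"
)

var klogLine = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([^:]+):(\d+)\] (.*)$`)

func main() {
	line := `I1124 11:07:14.821321 4752 status_manager.go:875] "Failed to update status for pod"`
	m := klogLine.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s file=%s line=%s msg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6], m[7])
}
```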
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.835530 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.849058 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
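The patch bodies in these entries are strategic merge patches: `$setElementOrder/conditions` pins the ordering of the conditions list, and each entry under `conditions` is merged into the existing list by its `type` key rather than replacing the array wholesale. A dependency-free sketch of that shape (the UID and condition values below are illustrative, not copied from the log):

```go
// Sketch: the shape of the status patch the kubelet is trying to send.
// "$setElementOrder/conditions" pins list order; entries under "conditions"
// merge by their "type" key. Values are illustrative stand-ins.
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	patch := map[string]any{
		"metadata": map[string]any{"uid": "00000000-0000-0000-0000-000000000000"},
		"status": map[string]any{
			"$setElementOrder/conditions": []map[string]any{
				{"type": "PodReadyToStartContainers"},
				{"type": "Initialized"},
				{"type": "Ready"},
				{"type": "ContainersReady"},
				{"type": "PodScheduled"},
			},
			"conditions": []map[string]any{
				{"type": "Ready", "status": "True", "lastTransitionTime": "2025-11-24T11:06:55Z"},
			},
		},
	}
	out, err := json.MarshalIndent(patch, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```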
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.866983 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\
"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.890550 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e820
4f01a093d4dd53c53dc2b568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.905057 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.906921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.906957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.906973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.906993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.907008 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:14Z","lastTransitionTime":"2025-11-24T11:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.919592 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.931499 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.939666 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.953075 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.966811 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.979480 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:14 crc kubenswrapper[4752]: I1124 11:07:14.991427 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:14Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.010125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.010181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.010204 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.010234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.010257 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.113183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.113220 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.113234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.113255 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.113271 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.216398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.216458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.216476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.216503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.216520 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.318906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.318965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.318978 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.318997 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.319010 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.421791 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.421849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.421870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.421895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.421911 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.524844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.524884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.524922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.524942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.524955 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.628424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.628487 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.628505 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.628532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.628550 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.727911 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.728030 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:15 crc kubenswrapper[4752]: E1124 11:07:15.728132 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.728047 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:15 crc kubenswrapper[4752]: E1124 11:07:15.728287 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.728045 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:15 crc kubenswrapper[4752]: E1124 11:07:15.728456 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:15 crc kubenswrapper[4752]: E1124 11:07:15.728616 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.731056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.731087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.731100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.731119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.731132 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.833303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.833347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.833359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.833373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.833385 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.937329 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.937408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.937456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.937492 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:15 crc kubenswrapper[4752]: I1124 11:07:15.937510 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:15Z","lastTransitionTime":"2025-11-24T11:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.040974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.041061 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.041088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.041119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.041252 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.145545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.145622 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.145645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.145670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.145687 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.249179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.249230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.249248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.249273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.249290 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.352543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.352645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.352664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.352689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.352708 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.455771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.455840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.455876 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.455906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.455930 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.564420 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.564497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.564515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.564673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.564705 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.667933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.667991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.668008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.668031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.668048 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.772839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.772942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.772967 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.773032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.773049 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.876955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.877008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.877024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.877050 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.877068 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.981088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.981159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.981177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.981207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:16 crc kubenswrapper[4752]: I1124 11:07:16.981224 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:16Z","lastTransitionTime":"2025-11-24T11:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.083968 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.084072 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.084092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.084115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.084134 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.186778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.186840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.186853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.186873 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.186885 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.290066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.290130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.290164 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.290193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.290214 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.393307 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.393365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.393382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.393405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.393422 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.455899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.456157 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.456301 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:25.456271504 +0000 UTC m=+51.441091833 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.496941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.497005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.497021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.497045 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.497061 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.601006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.601070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.601089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.601112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.601129 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
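
The mount failure above is not retried immediately: the kubelet re-queues it with exponential backoff, and the logged durationBeforeRetry of 8s fits a delay that doubles from a small initial value (0.5s × 2^4 = 8s would make this the fifth consecutive failure). Below is a hedged sketch of that bookkeeping under those assumed parameters, not the kubelet's actual nestedpendingoperations code.

```go
package main

import (
	"fmt"
	"time"
)

// expBackoff tracks consecutive failures of a single operation (here, a
// volume mount) and yields the delay before the next retry is permitted.
// The initial delay and cap are illustrative assumptions, not values
// taken from the kubelet source.
type expBackoff struct {
	failures int
	initial  time.Duration
	max      time.Duration
}

// nextDelay returns initial * 2^failures, capped at max, and counts the
// failure that triggered it.
func (b *expBackoff) nextDelay() time.Duration {
	d := b.initial << b.failures
	if d <= 0 || d > b.max { // the shift can overflow; clamp to the cap
		d = b.max
	}
	b.failures++
	return d
}

func main() {
	b := &expBackoff{initial: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 1; i <= 6; i++ {
		fmt.Printf("failure %d: durationBeforeRetry %v\n", i, b.nextDelay())
	}
	// Under these assumptions, failure 5 yields 8s; 11:07:17.456 + 8s is
	// exactly the "No retries permitted until ... 11:07:25.456" above.
}
```

The arithmetic is visible in the log itself: the failure at 11:07:17.456 plus the 8s backoff gives precisely the permitted-retry timestamp of 11:07:25.456.
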
Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.703739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.703800 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.703811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.703827 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.703838 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.727220 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.727254 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.727293 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.727226 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.727362 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.727427 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.727549 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:17 crc kubenswrapper[4752]: E1124 11:07:17.727644 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.806358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.806395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.806403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.806419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.806427 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.908841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.908919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.908936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.908957 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:17 crc kubenswrapper[4752]: I1124 11:07:17.908976 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:17Z","lastTransitionTime":"2025-11-24T11:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.012165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.012202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.012216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.012235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.012249 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.114854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.114892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.114903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.114919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.114930 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.217558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.217603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.217623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.217643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.217663 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.320962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.321005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.321019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.321044 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.321063 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.425106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.425179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.425192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.425215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.425229 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.529250 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.529339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.529363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.529401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.529424 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.632913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.632992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.633014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.633047 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.633069 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.738499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.738577 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.738590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.738609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.738623 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.841658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.841729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.841780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.841808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.841832 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.944917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.945002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.945031 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.945060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:18 crc kubenswrapper[4752]: I1124 11:07:18.945133 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:18Z","lastTransitionTime":"2025-11-24T11:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.054229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.054288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.054311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.054340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.054362 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.158426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.158813 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.158847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.158877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.158898 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.261944 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.261982 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.261992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.262007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.262018 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.365657 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.365703 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.365715 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.365784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.365796 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.469315 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.469370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.469381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.469400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.469412 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.572489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.572541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.572585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.572609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.572628 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.675300 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.675352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.675364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.675380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.675392 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.727342 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.727379 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.727343 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.727455 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:19 crc kubenswrapper[4752]: E1124 11:07:19.727597 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:19 crc kubenswrapper[4752]: E1124 11:07:19.727639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:19 crc kubenswrapper[4752]: E1124 11:07:19.727690 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:19 crc kubenswrapper[4752]: E1124 11:07:19.727833 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.778594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.778632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.778641 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.778654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.778663 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.880916 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.880962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.880973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.880990 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.881001 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.985906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.985974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.985993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.986015 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:19 crc kubenswrapper[4752]: I1124 11:07:19.986031 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:19Z","lastTransitionTime":"2025-11-24T11:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.089008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.089061 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.089080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.089104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.089250 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.192055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.192114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.192133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.192158 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.192172 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.296269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.296366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.296391 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.296424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.296447 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.398575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.398638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.398650 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.398667 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.398681 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.501823 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.501885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.501905 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.501939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.501962 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.605981 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.606067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.606091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.606167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.606248 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.709672 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.709798 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.709826 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.709858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.709879 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.812576 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.812630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.812646 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.812669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.812687 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.915621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.915683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.915694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.915711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:20 crc kubenswrapper[4752]: I1124 11:07:20.915723 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:20Z","lastTransitionTime":"2025-11-24T11:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.018497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.018535 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.018544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.018556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.018568 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.121038 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.121098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.121443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.121473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.121489 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.224929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.224991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.225010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.225034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.225053 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.327326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.327431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.327485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.327510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.327528 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
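The condition the kubelet keeps re-recording points at one concrete fact: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the container runtime network never becomes ready. A minimal sketch (not part of the log, standard library only) of how one might confirm that on the node itself; the path is taken from the log message, and the accepted extensions follow the usual CNI convention:

#!/usr/bin/env python3
# Check whether the directory the kubelet complains about actually holds
# any CNI network config. Path comes straight from the log message.
from pathlib import Path

CNI_DIR = Path("/etc/kubernetes/cni/net.d")

if not CNI_DIR.is_dir():
    print(f"{CNI_DIR} is missing entirely")
else:
    configs = [p for p in CNI_DIR.iterdir() if p.suffix in {".conf", ".conflist", ".json"}]
    if configs:
        for p in sorted(configs):
            print("found CNI config:", p.name)
    else:
        print(f"{CNI_DIR} exists but holds no CNI config -- matches NetworkPluginNotReady")

An empty directory here is expected while the network operator's pods are still down; the cycle below keeps repeating until a CNI plugin writes its config.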
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.643660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.643712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.643728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.643846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.643866 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.671820 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.671908 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.671934 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.671962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.671987 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.692178 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:21Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.696734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.696819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.696831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.696849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.696861 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... 11:07:21.715578: the kubelet retries the node-status patch; the request body and the resulting failure (failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid) are identical to the attempt at 11:07:21.692178 above ...]
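Every patch attempt dies at the same place: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate whose validity ended 2025-08-24T17:21:41Z, so the TLS handshake fails before the status patch is ever evaluated. A hedged sketch of how one might read that certificate's validity window directly from the endpoint; host and port come from the log, and the third-party 'cryptography' package is an assumption (parsing DER is not in the standard library):

#!/usr/bin/env python3
# Fetch the webhook's TLS certificate and print its validity window.
# Verification must be disabled here: the whole point is that the
# certificate no longer verifies.
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509  # assumption: 'cryptography' is installed

HOST, PORT = "127.0.0.1", 9743  # endpoint taken from the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False      # must be cleared before CERT_NONE
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc).replace(tzinfo=None)  # naive UTC, like the cert fields
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)
print("expired:  ", now > cert.not_valid_after)

If notAfter matches the 2025-08-24T17:21:41Z quoted in the error, the failures below are fully explained by the certificate alone.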
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.719716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.719800 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.719819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.719842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.719860 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.727763 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.727811 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.727840 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.727853 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.727995 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.728262 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.728317 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.728423 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.729612 4752 scope.go:117] "RemoveContainer" containerID="46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568"
[... 11:07:21.740003: a third node-status patch attempt with the same body fails with the same expired-certificate webhook error ...]
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.744064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.744103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
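The arithmetic in the error string is the entire diagnosis: the node's clock (2025-11-24T11:07:21Z) sits about three months past the certificate's notAfter instant (2025-08-24T17:21:41Z). A trivial restatement of that comparison, using only the two timestamps quoted in the log:

#!/usr/bin/env python3
# The two instants quoted in the x509 error, compared directly.
from datetime import datetime, timezone

now       = datetime(2025, 11, 24, 11, 7, 21, tzinfo=timezone.utc)   # "current time" from the log
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)   # certificate notAfter from the log

delta = now - not_after
print(f"certificate expired {delta.days} days ago")  # -> certificate expired 91 days ago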
event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.744115 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.744133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.744146 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.758872 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:21Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.770862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.770890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.770898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.770911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.770921 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.787664 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:21Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:21 crc kubenswrapper[4752]: E1124 11:07:21.788161 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.790441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.790492 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.790501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.790516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.790525 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.892632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.892662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.892671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.892684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.892693 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.994514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.994543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.994552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.994564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:21 crc kubenswrapper[4752]: I1124 11:07:21.994574 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:21Z","lastTransitionTime":"2025-11-24T11:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.096469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.096512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.096524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.096538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.096550 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.097680 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/1.log" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.100950 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.101692 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.121041 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.143845 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed
1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.160989 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.174441 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.196416 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 
7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.198193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.198216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.198224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.198237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.198245 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.207304 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.226781 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.240733 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.257969 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.278135 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.288500 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.299546 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.300836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.300860 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.300868 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.300881 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.300889 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.309878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.321770 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.331110 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.344812 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6
c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.356348 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:22Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.403237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.403271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.403282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.403296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.403305 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.506631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.506679 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.506690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.506705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.506717 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.609240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.609288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.609301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.609320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.609331 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.712064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.712106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.712117 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.712134 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.712146 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.815439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.815476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.815486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.815506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.815519 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.918615 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.918693 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.918717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.918775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:22 crc kubenswrapper[4752]: I1124 11:07:22.918792 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:22Z","lastTransitionTime":"2025-11-24T11:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.022323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.022358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.022367 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.022380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.022396 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
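[Annotation] The kubelet repeats this five-event NodeNotReady cycle roughly every 100ms (11:07:22.403, .506, .609, .712, .815, .918 ...): the container runtime keeps reporting NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and that directory stays empty until the network provider (here Multus/OVN-Kubernetes, whose pods appear below) is healthy enough to write one. A minimal Go sketch of the check the message implies — scanning the conf dir for a network config, assuming the CNI library's usual extensions (*.conf, *.conflist, *.json); treat those details as an assumption, not kubelet's actual code:

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	confDir := "/etc/kubernetes/cni/net.d" // the directory named in the log message
    	var found []string
    	// Extensions assumed from common CNI conventions, not kubelet source.
    	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
    		matches, err := filepath.Glob(filepath.Join(confDir, pat))
    		if err != nil {
    			fmt.Fprintln(os.Stderr, err)
    			os.Exit(1)
    		}
    		found = append(found, matches...)
    	}
    	if len(found) == 0 {
    		fmt.Println("no CNI configuration file -> node stays NotReady")
    		return
    	}
    	fmt.Println("CNI configs found:", found)
    }

On a live node the equivalent spot-check is simply listing that directory; once a conflist lands there, the NetworkReady condition flips and this event spam stops.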
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.107127 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/2.log" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.107967 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/1.log" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.111456 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" exitCode=1 Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.111528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.111595 4752 scope.go:117] "RemoveContainer" containerID="46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.112313 4752 scope.go:117] "RemoveContainer" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" Nov 24 11:07:23 crc kubenswrapper[4752]: E1124 11:07:23.112504 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.125020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.125073 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.125089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.125109 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.125122 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
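[Annotation] The proximate cause of the missing CNI config is visible here: ovnkube-controller (container 5256aece...) exited with code 1 and the kubelet has put it in CrashLoopBackOff with a 20s delay; the status patch further down shows it dying in ovnkube.go:137 ("failed to run ovnkube: failed to start network controller"). A hedged sketch of the commonly documented CrashLoopBackOff schedule — 10s base, doubled per restart, capped at 5m; the constants are an assumption from public docs, not lifted from kubelet source:

    package main

    import (
    	"fmt"
    	"time"
    )

    // backoffDelay models the restart delay kubelet applies to a crash-looping
    // container: exponential from an assumed 10s base, capped at 5 minutes.
    func backoffDelay(restarts int) time.Duration {
    	d := 10 * time.Second
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d >= 5*time.Minute {
    			return 5 * time.Minute
    		}
    	}
    	return d
    }

    func main() {
    	// restarts=1 yields 20s, matching the "back-off 20s" in the log above.
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("restarts=%d -> back-off %s\n", r, backoffDelay(r))
    	}
    }

The previous attempt's output can usually be recovered with oc logs -n openshift-ovn-kubernetes ovnkube-node-bkksr -c ovnkube-controller --previous, which is where the "failed to start default network controller" detail quoted later in this log comes from.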
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.127939 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.142421 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.154579 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.181221 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.216539 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.227964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.227999 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.228008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.228022 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.228032 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.243984 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46efc85a4fcde5af1909029f818819e466a4e8204f01a093d4dd53c53dc2b568\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:07.843897 6176 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 11:07:07.843920 6176 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 11:07:07.843925 6176 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 11:07:07.843936 6176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 11:07:07.843969 6176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 11:07:07.843994 6176 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 11:07:07.843999 6176 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 11:07:07.844044 6176 factory.go:656] Stopping watch factory\\\\nI1124 11:07:07.844063 6176 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:07.844088 6176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 11:07:07.844087 6176 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 11:07:07.844106 6176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 11:07:07.844112 6176 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 11:07:07.844120 6176 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 11:07:07.844042 6176 handler.go:208] Removed *v1.Node event handler 
7\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.255217 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.269109 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.279993 
4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.290174 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.302147 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6
c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.316023 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.329438 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.333839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.333889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.333902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.333919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.333931 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.351310 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.364064 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.376041 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.389375 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:23Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.437257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.437309 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.437321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.437341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.437353 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.540349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.540423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.540445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.540473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.540495 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.643798 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.643844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.643854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.643870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.643881 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.727538 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.727667 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.727579 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.727556 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:23 crc kubenswrapper[4752]: E1124 11:07:23.727829 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:23 crc kubenswrapper[4752]: E1124 11:07:23.727950 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:23 crc kubenswrapper[4752]: E1124 11:07:23.728094 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:23 crc kubenswrapper[4752]: E1124 11:07:23.728189 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.746879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.746947 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.746970 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.747001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.747028 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.850526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.850580 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.850602 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.850621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.850637 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.954450 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.954546 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.954581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.954612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:23 crc kubenswrapper[4752]: I1124 11:07:23.954633 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:23Z","lastTransitionTime":"2025-11-24T11:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.070547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.070608 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.070629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.070659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.070678 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.117937 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/2.log" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.124049 4752 scope.go:117] "RemoveContainer" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" Nov 24 11:07:24 crc kubenswrapper[4752]: E1124 11:07:24.124430 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.145978 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.156544 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.173727 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.173797 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.173810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.173827 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.173844 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.174431 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.192672 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.212017 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162c
ffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.221846 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.233497 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.246272 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.257551 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.272384 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.276850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.276907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.276919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.276937 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.276947 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.286999 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.302933 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.317709 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.351974 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.371192 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.380007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.380052 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.380064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.380082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.380097 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.386891 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.407238 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.483673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.483765 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.483778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.483798 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.483810 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.587816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.587882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.587896 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.587919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.587937 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.691258 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.691323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.691340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.691369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.691386 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.763645 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.787125 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.794206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.794266 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.794291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.794319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.794354 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.809700 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.827527 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.846171 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.861028 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.882653 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85
a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.896656 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.897489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.897524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.897534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.897550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.897563 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:24Z","lastTransitionTime":"2025-11-24T11:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.914016 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.925703 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc 
kubenswrapper[4752]: I1124 11:07:24.941370 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.960174 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.972086 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:24 crc kubenswrapper[4752]: I1124 11:07:24.987550 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:24Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.000217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.000264 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.000277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.000296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.000308 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.006845 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:25Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.019814 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:25Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.035607 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:25Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.103584 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.103619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.103629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.103645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.103655 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.207199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.207274 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.207295 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.207321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.207340 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.310591 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.310652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.310670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.310696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.310712 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.413794 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.413854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.413871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.413898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.413917 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.516663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.516729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.516771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.516799 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.516817 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.550457 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.550637 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.550734 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:07:41.550708973 +0000 UTC m=+67.535529302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.619600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.619634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.619647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.619663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.619676 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.723576 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.723638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.723658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.723682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.723702 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.727990 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.727991 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.728099 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.728117 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.728320 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.728453 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.728584 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:25 crc kubenswrapper[4752]: E1124 11:07:25.728713 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.827616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.827682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.827702 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.827727 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.827772 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.931503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.931572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.931590 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.931616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:25 crc kubenswrapper[4752]: I1124 11:07:25.931633 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:25Z","lastTransitionTime":"2025-11-24T11:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.034900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.034949 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.034962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.034985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.035008 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.137360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.137423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.137445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.137471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.137490 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.241545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.241624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.241648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.241676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.241698 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.345570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.345639 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.345663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.345691 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.345710 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.448987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.449076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.449103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.449201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.449232 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.461008 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.461172 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:07:58.461137716 +0000 UTC m=+84.445958035 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.553504 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.553671 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.553721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.553807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.553831 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.563350 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.563501 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.563575 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.563631 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563677 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563726 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563805 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:58.563778705 +0000 UTC m=+84.548599024 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563826 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563859 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563920 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563931 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563962 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:58.56392904 +0000 UTC m=+84.548749379 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.563982 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.564011 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.564046 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:58.564017883 +0000 UTC m=+84.548838202 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:26 crc kubenswrapper[4752]: E1124 11:07:26.564135 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:07:58.564110736 +0000 UTC m=+84.548931075 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.657574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.657633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.657652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.657689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.657708 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.767012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.767090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.767136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.767169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.767191 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.870624 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.870689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.870712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.870740 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.870798 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.973339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.973412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.973436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.973465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:26 crc kubenswrapper[4752]: I1124 11:07:26.973489 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:26Z","lastTransitionTime":"2025-11-24T11:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.076130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.076193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.076211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.076236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.076254 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.179451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.179486 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.179499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.179517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.179530 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.282637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.282696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.282714 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.282737 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.282787 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.385362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.385409 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.385427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.385448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.385461 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.488684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.488780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.488808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.488854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.488884 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.592862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.592912 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.592942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.592966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.592980 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.696197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.696270 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.696324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.696349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.696368 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.727975 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.728049 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.728054 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:27 crc kubenswrapper[4752]: E1124 11:07:27.728162 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.728254 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:27 crc kubenswrapper[4752]: E1124 11:07:27.728410 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:27 crc kubenswrapper[4752]: E1124 11:07:27.728488 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:27 crc kubenswrapper[4752]: E1124 11:07:27.728613 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.799705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.799807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.799840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.799867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.799886 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.903154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.903226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.903249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.903280 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:27 crc kubenswrapper[4752]: I1124 11:07:27.903301 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:27Z","lastTransitionTime":"2025-11-24T11:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.007222 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.007316 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.007355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.007386 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.007412 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.111833 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.111941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.111966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.111998 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.112022 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.215625 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.215708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.215724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.215774 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.215792 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.318790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.318837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.318855 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.318878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.318895 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.422072 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.422143 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.422166 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.422193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.422213 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.525597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.525715 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.525739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.525819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.525842 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.629330 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.629412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.629456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.629478 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.629493 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.733246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.733308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.733344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.733385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.733410 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.836807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.836858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.836875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.836898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.836915 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.940163 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.940224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.940240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.940265 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:28 crc kubenswrapper[4752]: I1124 11:07:28.940281 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:28Z","lastTransitionTime":"2025-11-24T11:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.042781 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.042852 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.042864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.042911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.042925 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.133490 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.146946 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147505 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147553 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147575 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.147657 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.163289 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.174443 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.189279 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.202573 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.219016 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6
c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.234196 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.251124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.251189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.251213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.251243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.251265 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.260199 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.294732 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb14
2aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.316458 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.332459 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.353920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.354026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.354054 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.354089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.354112 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.362024 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.376710 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.394169 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.412314 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.434210 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.453278 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:29Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.457128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.457186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.457204 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.457242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.457259 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.560471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.560510 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.560521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.560538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.560621 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.663644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.663712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.663728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.663783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.663801 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.727130 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.727305 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.727541 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:29 crc kubenswrapper[4752]: E1124 11:07:29.727516 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:29 crc kubenswrapper[4752]: E1124 11:07:29.727718 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:29 crc kubenswrapper[4752]: E1124 11:07:29.727881 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.728113 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:29 crc kubenswrapper[4752]: E1124 11:07:29.728286 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.766397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.766474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.766508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.766539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.766561 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.870104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.870185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.870199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.870217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.870228 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.973721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.973824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.973859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.973891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:29 crc kubenswrapper[4752]: I1124 11:07:29.973911 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:29Z","lastTransitionTime":"2025-11-24T11:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.077434 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.077489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.077507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.077532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.077552 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.181132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.181183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.181198 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.181219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.181235 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.284245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.284309 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.284326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.284353 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.284370 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.386956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.386996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.387005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.387019 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.387030 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.489920 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.490017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.490036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.490059 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.490076 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.592219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.592286 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.592345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.592366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.592378 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.694507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.694552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.694566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.694584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.694596 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.797313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.797380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.797403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.797433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.797454 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.901103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.901161 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.901182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.901209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:30 crc kubenswrapper[4752]: I1124 11:07:30.901230 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:30Z","lastTransitionTime":"2025-11-24T11:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.004803 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.004873 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.004891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.004914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.004935 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.107557 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.107619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.107637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.107660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.107677 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.210909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.210977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.211000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.211030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.211053 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.314090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.314176 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.314199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.314233 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.314261 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.417150 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.417227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.417250 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.417279 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.417305 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.521100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.521152 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.521168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.521199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.521215 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.624456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.624528 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.624545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.624569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.624590 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.726403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.726453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.726465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.726482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.726496 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.727289 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.727321 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.727319 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:07:31 crc kubenswrapper[4752]: E1124 11:07:31.727399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:07:31 crc kubenswrapper[4752]: E1124 11:07:31.727540 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:07:31 crc kubenswrapper[4752]: E1124 11:07:31.727621 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.727721 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:07:31 crc kubenswrapper[4752]: E1124 11:07:31.727869 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.829423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.829498 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.829515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.829541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.829559 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.933129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.933182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.933199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.933225 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:31 crc kubenswrapper[4752]: I1124 11:07:31.933242 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:31Z","lastTransitionTime":"2025-11-24T11:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
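[Editor's note] Every failure above has the same root cause: the kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/, so NetworkReady stays false and pod sandboxes cannot be created. As a quick cross-check one could run on the node, here is a minimal Go sketch (a hypothetical diagnostic, not part of the kubelet; the accepted extensions .conf/.conflist/.json are an assumption about typical CNI loaders, not something stated in this log):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory the kubelet errors above report as empty; adjust if your confDir differs.
	dir := "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}

	found := 0
	for _, e := range entries {
		// Extensions a CNI config loader typically accepts (assumption, not from the log).
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config candidate:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found - consistent with the NetworkPluginNotReady errors above")
	}
}

An empty result points at the network provider (presumably the cluster network operator / OVN stack on this CRC node) not having written its config yet, rather than at a kubelet fault.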
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.013174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.013221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.013234 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.013249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.013260 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.030080 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:32Z is after 2025-08-24T17:21:41Z"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.034849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.034884 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
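[Editor's note] The status patch itself is also failing: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-11-24. A minimal Go sketch of the kind of check one might run on the node to confirm this (an assumed diagnostic, not from the log; InsecureSkipVerify is deliberate here so the expired chain can still be fetched for inspection):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Webhook endpoint taken verbatim from the error above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Inspection only: retrieve the certificate even though verification would fail.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificates presented")
		return
	}
	leaf := certs[0]
	fmt.Println("subject:   ", leaf.Subject)
	fmt.Println("not before:", leaf.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("not after: ", leaf.NotAfter.UTC().Format(time.RFC3339))
	if time.Now().UTC().After(leaf.NotAfter) {
		fmt.Println("certificate is expired - matching the x509 error in the log")
	}
}

On CRC this pattern typically appears when a long-stopped VM is resumed past its certificate rotation window; the kubelet keeps retrying the patch, which is why the same error recurs below.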
event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.034895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.034909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.034920 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.055632 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:32Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.066364 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.066461 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
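[Editor's note] For log triage it can help to pull the condition JSON out of the "Node became not ready" entries programmatically instead of eyeballing it. A small illustrative Go sketch; the struct below is hand-rolled for just the fields visible in these lines (not the upstream k8s.io/api type), and the sample payload is copied from one of the setters.go entries above:

package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors only the fields that appear in the log lines above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied from a "Node became not ready" entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal failed:", err)
		return
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}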
event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.066517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.066550 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.066709 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.090018 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:32Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.096241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.096344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.096408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.096431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.096479 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.116438 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:32Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.122094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.122137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.122153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.122173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.122191 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.145887 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:32Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:32 crc kubenswrapper[4752]: E1124 11:07:32.146196 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.148282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.148354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.148365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.148383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.148396 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.251148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.251204 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.251221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.251244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.251261 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.355360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.355415 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.355433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.355458 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.355482 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.458909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.459116 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.459144 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.459173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.459197 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.563067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.563150 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.563167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.563196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.563214 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.667156 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.667220 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.667278 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.667304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.667322 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.770333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.770397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.770414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.770436 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.770453 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.873258 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.873326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.873343 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.873368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.873386 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.977153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.977221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.977244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.977275 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:32 crc kubenswrapper[4752]: I1124 11:07:32.977298 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:32Z","lastTransitionTime":"2025-11-24T11:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.080686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.080867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.080899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.080934 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.080960 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.183729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.183851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.183870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.183894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.183910 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.287257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.287340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.287363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.287394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.287417 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.391104 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.391180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.391215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.391246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.391266 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.494804 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.494890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.494925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.494954 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.494978 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.597595 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.597658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.597675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.597703 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.597724 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.700932 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.701035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.701060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.701092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.701117 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.727350 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.727408 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.727425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:33 crc kubenswrapper[4752]: E1124 11:07:33.727529 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.727352 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:33 crc kubenswrapper[4752]: E1124 11:07:33.727639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:33 crc kubenswrapper[4752]: E1124 11:07:33.727838 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:33 crc kubenswrapper[4752]: E1124 11:07:33.728049 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.805089 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.805167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.805192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.805223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.805247 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.909064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.909145 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.909167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.909196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:33 crc kubenswrapper[4752]: I1124 11:07:33.909222 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:33Z","lastTransitionTime":"2025-11-24T11:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.012494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.012540 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.012551 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.012571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.012584 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.116024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.116094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.116130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.116162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.116204 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.218948 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.219013 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.219030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.219054 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.219070 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.321857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.321950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.321963 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.321980 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.321997 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.425230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.425288 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.425301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.425319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.425331 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.528516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.528596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.528611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.528632 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.528667 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.632463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.632509 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.632521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.632538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.632551 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.735017 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.735079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.735102 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.735132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.735157 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.749086 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.767425 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.785380 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.798173 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.818191 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.835218 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.838628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.838882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.838925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.838952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.838972 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.856527 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.878168 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.894784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.907662 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.922801 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.942446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.942514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.942527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.942544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.942555 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:34Z","lastTransitionTime":"2025-11-24T11:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.956878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85
a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.969728 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:34 crc kubenswrapper[4752]: I1124 11:07:34.986530 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.001796 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:34Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.015945 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:35Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.031332 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:35Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.044888 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:35Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.045680 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.045756 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.045774 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.045795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.045812 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.148515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.148574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.148592 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.148616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.148633 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.251827 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.252326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.252341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.252358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.252367 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.354856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.354921 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.354938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.354962 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.354982 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.458630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.458694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.458711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.458736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.458788 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.562584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.562651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.562676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.562707 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.562729 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.666481 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.666535 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.666555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.666581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.666604 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.726983 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.727089 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.727112 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.727329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:35 crc kubenswrapper[4752]: E1124 11:07:35.727320 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:35 crc kubenswrapper[4752]: E1124 11:07:35.727563 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:35 crc kubenswrapper[4752]: E1124 11:07:35.727694 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:35 crc kubenswrapper[4752]: E1124 11:07:35.727721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.769277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.769327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.769346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.769368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.769385 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.872735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.872839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.872870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.872900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.872920 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.976588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.976634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.976650 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.976673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:35 crc kubenswrapper[4752]: I1124 11:07:35.976689 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:35Z","lastTransitionTime":"2025-11-24T11:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.079730 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.079825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.079842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.079867 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.079889 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.182853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.182907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.182919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.182940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.182954 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.285913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.285965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.285981 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.286003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.286020 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.393526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.393630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.393648 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.393673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.393700 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.496689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.497236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.497437 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.497680 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.497932 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.601523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.601615 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.601643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.601675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.601699 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.704283 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.704596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.704688 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.704811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.704975 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.808413 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.808474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.808494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.808519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.808534 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.911562 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.911675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.911700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.911767 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:36 crc kubenswrapper[4752]: I1124 11:07:36.911809 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:36Z","lastTransitionTime":"2025-11-24T11:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.013943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.013986 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.013998 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.014018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.014032 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.120964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.121020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.121040 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.121067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.121089 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.223335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.223366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.223375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.223389 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.223398 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.326570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.326618 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.326636 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.326658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.326674 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.430195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.430235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.430244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.430259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.430269 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.533209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.533260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.533276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.533299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.533316 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.636075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.636110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.636121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.636136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.636148 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.727874 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:37 crc kubenswrapper[4752]: E1124 11:07:37.728097 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.728549 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:37 crc kubenswrapper[4752]: E1124 11:07:37.728716 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.730626 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.730695 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:37 crc kubenswrapper[4752]: E1124 11:07:37.730851 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:37 crc kubenswrapper[4752]: E1124 11:07:37.730967 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.735368 4752 scope.go:117] "RemoveContainer" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" Nov 24 11:07:37 crc kubenswrapper[4752]: E1124 11:07:37.735706 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.739215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.739265 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.739274 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.739289 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.739298 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.841936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.841967 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.841975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.841989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.841997 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.944594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.944621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.944630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.944643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:37 crc kubenswrapper[4752]: I1124 11:07:37.944651 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:37Z","lastTransitionTime":"2025-11-24T11:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.048311 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.048385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.048403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.048429 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.048446 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.151049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.151124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.151136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.151154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.151167 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.253798 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.253872 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.253889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.253913 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.253933 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.358244 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.358341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.358384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.358448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.358473 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.461718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.461810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.461826 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.461851 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.461870 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.564447 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.564480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.564489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.564501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.564510 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.667336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.667469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.667489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.667515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.667533 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.773148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.773216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.773242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.773257 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.773266 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.875304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.875334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.875344 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.875358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.875369 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.978101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.978147 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.978164 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.978188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:38 crc kubenswrapper[4752]: I1124 11:07:38.978204 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:38Z","lastTransitionTime":"2025-11-24T11:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.080936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.080976 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.080986 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.081002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.081011 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.182600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.182647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.182658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.182673 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.182687 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.285649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.285711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.285729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.285783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.285803 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.388159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.388207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.388219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.388237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.388248 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.491372 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.491405 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.491417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.491432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.491446 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.593736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.593830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.593847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.593872 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.593890 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.696527 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.696583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.696606 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.696638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.696659 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.727577 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.727622 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.727672 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.727736 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:39 crc kubenswrapper[4752]: E1124 11:07:39.727910 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:39 crc kubenswrapper[4752]: E1124 11:07:39.728150 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:39 crc kubenswrapper[4752]: E1124 11:07:39.728221 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:39 crc kubenswrapper[4752]: E1124 11:07:39.728311 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.799524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.799573 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.799589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.799607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.799620 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.901437 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.901474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.901485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.901500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:39 crc kubenswrapper[4752]: I1124 11:07:39.901510 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:39Z","lastTransitionTime":"2025-11-24T11:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.004521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.004575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.004584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.004599 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.004609 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.106891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.106943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.106952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.106966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.106976 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.209603 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.209664 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.209676 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.209692 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.209701 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.312964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.313020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.313037 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.313058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.313075 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.415766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.415799 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.415808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.415821 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.415834 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.519032 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.519080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.519098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.519124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.519140 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.621864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.621903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.621914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.621929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.621940 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.724111 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.724151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.724159 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.724174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.724183 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.826930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.826970 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.826986 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.827008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.827026 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.930162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.930195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.930204 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.930216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:40 crc kubenswrapper[4752]: I1124 11:07:40.930226 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:40Z","lastTransitionTime":"2025-11-24T11:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.032889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.032925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.032935 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.032951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.032961 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.135479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.136251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.136350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.136439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.136518 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.238832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.238891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.238902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.238919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.238931 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.340859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.341140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.341215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.341278 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.341338 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.445316 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.445354 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.445365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.445380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.445392 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.549861 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.549902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.549910 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.549924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.549932 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.646255 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.646392 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.646438 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:08:13.646425404 +0000 UTC m=+99.631245693 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.727128 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.727178 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.727142 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.727125 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.727254 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.727301 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.727347 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:41 crc kubenswrapper[4752]: E1124 11:07:41.727381 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.755900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.755941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.755956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.755974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.755997 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.859252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.859319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.859333 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.859352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.859364 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.962199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.962239 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.962252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.962271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:41 crc kubenswrapper[4752]: I1124 11:07:41.962283 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:41Z","lastTransitionTime":"2025-11-24T11:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.065340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.065373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.065382 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.065397 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.065408 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.170352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.170395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.170404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.170419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.170430 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.272823 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.272866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.272877 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.272890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.272899 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.375716 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.375827 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.375854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.375885 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.375908 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.423940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.424013 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.424030 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.424056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.424073 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.443421 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:42Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.447149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.447190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.447202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.447219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.447232 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.463431 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:42Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.468366 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.468419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.468437 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.468463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.468484 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.487837 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:42Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.492248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.492301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.492320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.492346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.492364 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.539873 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:42Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.545720 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.545803 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.545820 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.545845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.545861 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.566912 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:42Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:42 crc kubenswrapper[4752]: E1124 11:07:42.567080 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.568573 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.568611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.568623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.568641 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.568653 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.670767 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.670811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.670822 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.670838 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.670851 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.776772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.776808 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.776819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.776836 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.776849 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.879478 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.879532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.879547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.879589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.879602 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.982724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.982802 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.982823 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.982850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:42 crc kubenswrapper[4752]: I1124 11:07:42.982918 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:42Z","lastTransitionTime":"2025-11-24T11:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.085420 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.085466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.085480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.085501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.085512 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.187301 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.187339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.187348 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.187361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.187372 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.290626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.290684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.290696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.290718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.290730 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.393730 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.393775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.393783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.393797 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.393806 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.496195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.496241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.496258 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.496282 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.496300 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.599036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.599072 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.599083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.599098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.599108 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.702122 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.702160 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.702171 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.702188 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.702201 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.727705 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.727806 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.727806 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.727858 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:43 crc kubenswrapper[4752]: E1124 11:07:43.727954 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:43 crc kubenswrapper[4752]: E1124 11:07:43.728109 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:43 crc kubenswrapper[4752]: E1124 11:07:43.728177 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:43 crc kubenswrapper[4752]: E1124 11:07:43.728339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.805252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.805291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.805299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.805312 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.805322 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.907951 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.908004 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.908016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.908034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:43 crc kubenswrapper[4752]: I1124 11:07:43.908045 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:43Z","lastTransitionTime":"2025-11-24T11:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.010485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.010538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.010549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.010566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.010579 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.113135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.113181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.113192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.113237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.113250 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.193709 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/0.log" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.193779 4752 generic.go:334] "Generic (PLEG): container finished" podID="f578963d-5ff1-4e31-945b-cc59f0b244bf" containerID="8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be" exitCode=1 Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.193809 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerDied","Data":"8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.194101 4752 scope.go:117] "RemoveContainer" containerID="8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.212049 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.218277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.218316 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.218324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.218336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.218348 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.230879 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.246779 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.268901 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.282990 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.296636 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.309323 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.321088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.321152 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.321168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.321183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.321194 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.339665 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.353765 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.364957 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.387055 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.399153 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9
cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.410822 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.424112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.424141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.424150 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.424165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.424175 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.426824 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.446713 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.457017 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.468830 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.479062 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.526205 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.526248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.526256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.526268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.526276 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.628378 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.628417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.628430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.628446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.628460 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.730972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.731011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.731020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.731033 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.731044 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.744860 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.756720 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.769687 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z"
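The kube-multus termination message just above shows the second failure mode in this journal: multus copied its CNI plugins successfully, started its daemon, then polled for the default network's readiness indicator file (/host/run/multus/cni/net.d/10-ovn-kubernetes.conf) until it timed out, because the crash-looping ovnkube-controller never writes the OVN-Kubernetes CNI config. The quoted "timed out waiting for the condition" is the standard error string from the wait package in k8s.io/apimachinery. Below is a minimal Go sketch of such a readiness-file wait, not multus's actual code; the 1s interval is an assumption, and the 45s timeout is only inferred from the gap between the 11:06:58 start and the 11:07:43 error above.

// readiness_wait.go — sketch of polling for a CNI readiness indicator file.
package main

import (
	"fmt"
	"os"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func waitForReadinessIndicator(path string, timeout time.Duration) error {
	// PollImmediate checks the condition once right away, then every second,
	// until the file exists, an unexpected error occurs, or the timeout hits
	// (in which case it returns the "timed out waiting for the condition" error).
	return wait.PollImmediate(1*time.Second, timeout, func() (bool, error) {
		if _, err := os.Stat(path); err == nil {
			return true, nil // indicator file present: default network is ready
		} else if os.IsNotExist(err) {
			return false, nil // not written yet: keep polling
		} else {
			return false, err // any other stat failure aborts the wait
		}
	})
}

func main() {
	const indicator = "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"
	if err := waitForReadinessIndicator(indicator, 45*time.Second); err != nil {
		fmt.Printf("still waiting for readinessindicatorfile @ %s: %v\n", indicator, err)
		os.Exit(1)
	}
	fmt.Println("default network ready")
}

Because the indicator file never appears, the kubelet also keeps reporting NetworkPluginNotReady ("no CNI configuration file in /etc/kubernetes/cni/net.d/"), which is why the NodeNotReady events repeat throughout this section.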
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.798875 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.809503 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.821416 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.832589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.832630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.832640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.832658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.832672 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.834274 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.846772 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.856894 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.868861 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.879889 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.892928 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.902076 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.922202 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.936769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.936823 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.936834 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.936854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.936867 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:44Z","lastTransitionTime":"2025-11-24T11:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.937062 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.949674 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:44 crc kubenswrapper[4752]: I1124 11:07:44.963275 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:44Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.039286 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.039324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc 
kubenswrapper[4752]: I1124 11:07:45.039340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.039355 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.039366 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.141195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.141235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.141245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.141260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.141271 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.198499 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/0.log" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.198547 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerStarted","Data":"f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.210046 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.222149 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.233092 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.243448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.243473 4752 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.243489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.243506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.243517 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.244915 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.255737 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.268868 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.290387 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.301073 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.310735 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: 
I1124 11:07:45.320134 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.330811 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346184 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346228 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346256 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.346609 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.357357 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.375139 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.400009 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.413545 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.426135 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.440022 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:45Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.448575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.448609 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.448617 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.448631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.448642 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.550277 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.550339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.550347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.550359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.550369 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.652179 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.652226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.652238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.652254 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.652266 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.727726 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.727804 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:45 crc kubenswrapper[4752]: E1124 11:07:45.727893 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.727930 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.727977 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:45 crc kubenswrapper[4752]: E1124 11:07:45.727999 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:45 crc kubenswrapper[4752]: E1124 11:07:45.728070 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:45 crc kubenswrapper[4752]: E1124 11:07:45.728194 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.755816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.755878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.755896 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.755918 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.755935 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.859082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.859132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.859141 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.859157 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.859167 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.961832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.961878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.961889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.961912 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:45 crc kubenswrapper[4752]: I1124 11:07:45.961923 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:45Z","lastTransitionTime":"2025-11-24T11:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.063543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.063583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.063594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.063610 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.063622 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.166347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.166399 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.166411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.166451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.166465 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.269363 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.269419 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.269430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.269443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.269452 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.371500 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.371537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.371547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.371561 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.371570 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.474762 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.474800 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.474811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.474826 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.474835 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.578362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.578403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.578411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.578445 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.578459 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.680408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.680450 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.680459 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.680473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.680483 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.782933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.782984 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.782996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.783014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.783026 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.885719 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.885761 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.885769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.885783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.885791 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.988484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.988532 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.988543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.988578 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:46 crc kubenswrapper[4752]: I1124 11:07:46.988590 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:46Z","lastTransitionTime":"2025-11-24T11:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.091319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.091383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.091401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.091425 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.091442 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.194375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.194463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.194485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.194538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.194556 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.298060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.298106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.298118 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.298137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.298149 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.401056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.401114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.401131 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.401158 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.401180 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.503185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.503220 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.503230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.503247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.503257 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.605185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.605238 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.605269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.605290 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.605304 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.707483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.707519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.707529 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.707543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.707554 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.726970 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.726990 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.726990 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:47 crc kubenswrapper[4752]: E1124 11:07:47.727116 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.727145 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:47 crc kubenswrapper[4752]: E1124 11:07:47.727213 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:47 crc kubenswrapper[4752]: E1124 11:07:47.727280 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:47 crc kubenswrapper[4752]: E1124 11:07:47.727284 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.810136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.810170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.810181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.810196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.810207 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.912471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.912514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.912525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.912537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:47 crc kubenswrapper[4752]: I1124 11:07:47.912547 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:47Z","lastTransitionTime":"2025-11-24T11:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.014903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.014965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.014986 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.015013 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.015036 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.117394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.117433 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.117443 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.117460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.117471 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.219487 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.219548 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.219558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.219572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.219592 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.322058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.322723 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.322818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.322883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.322944 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.426149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.426209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.426231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.426259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.426282 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.529122 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.529184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.529202 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.529226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.529243 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.632468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.632773 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.632863 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.632988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.633103 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.735260 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.735297 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.735308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.735324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.735335 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.838384 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.838465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.838504 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.838534 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.838558 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.941334 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.941375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.941391 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.941409 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:48 crc kubenswrapper[4752]: I1124 11:07:48.941422 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:48Z","lastTransitionTime":"2025-11-24T11:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.045395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.045451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.045467 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.045490 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.045507 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.149463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.149655 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.149685 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.149725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.149811 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.253771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.253811 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.253825 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.253840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.253850 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.356161 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.356235 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.356251 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.356275 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.356291 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.459184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.459339 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.459358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.459373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.459381 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.561369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.561402 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.561410 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.561425 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.561433 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.664289 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.664335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.664345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.664361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.664371 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.727027 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.727131 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:49 crc kubenswrapper[4752]: E1124 11:07:49.727279 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.727329 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.727425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:49 crc kubenswrapper[4752]: E1124 11:07:49.727477 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:49 crc kubenswrapper[4752]: E1124 11:07:49.727606 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:49 crc kubenswrapper[4752]: E1124 11:07:49.727668 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.766444 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.766684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.766765 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.766862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.766932 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.869814 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.869880 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.869889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.869906 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.869916 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.972767 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.973167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.973182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.973200 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:49 crc kubenswrapper[4752]: I1124 11:07:49.973210 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:49Z","lastTransitionTime":"2025-11-24T11:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.076448 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.076526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.076545 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.076574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.076589 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.179782 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.179842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.179852 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.179871 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.179887 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.282613 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.282695 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.282713 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.282780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.282799 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.385859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.385909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.386103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.386140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.386157 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.488346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.488428 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.488442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.488461 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.488473 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.591026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.591100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.591121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.591151 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.591173 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.694090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.694137 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.694165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.694184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.694195 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.731276 4752 scope.go:117] "RemoveContainer" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.741621 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.796829 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.797231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.797253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.797279 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.797297 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.900124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.900422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.900568 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.900696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:50 crc kubenswrapper[4752]: I1124 11:07:50.900849 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:50Z","lastTransitionTime":"2025-11-24T11:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.002541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.002574 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.002583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.002595 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.002604 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.104963 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.104983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.104991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.105002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.105012 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.207369 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.207403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.207412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.207425 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.207435 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.216931 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/2.log" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.219176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.219426 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.232223 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15
e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.241967 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.254453 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.271203 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.281375 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.290385 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 
11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.300860 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.309319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.309369 4752 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.309381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.309396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.309408 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.312566 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.325247 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.335869 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.347571 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1d7e03f-e4e6-4fcc-b5c3-2959ec48c7a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.368754 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.384032 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.396032 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.411205 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.411984 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.412049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.412072 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.412100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.412123 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.430082 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.443056 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.460531 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.480489 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:51Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.514801 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.514848 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.514859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.514876 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.514888 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.617597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.617635 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.617645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.617661 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.617672 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.719619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.719657 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.719666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.719680 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.719690 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.727660 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:51 crc kubenswrapper[4752]: E1124 11:07:51.727823 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.727873 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.727884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.727908 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:51 crc kubenswrapper[4752]: E1124 11:07:51.728072 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:51 crc kubenswrapper[4752]: E1124 11:07:51.728166 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:51 crc kubenswrapper[4752]: E1124 11:07:51.728221 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.822249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.822305 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.822314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.822328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.822338 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.926464 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.926502 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.926512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.926526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:51 crc kubenswrapper[4752]: I1124 11:07:51.926538 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:51Z","lastTransitionTime":"2025-11-24T11:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.029525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.029594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.029616 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.029637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.029654 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.132308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.132380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.132396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.132420 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.132441 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.223840 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/3.log" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.225022 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/2.log" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.227621 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" exitCode=1 Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.227661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.227693 4752 scope.go:117] "RemoveContainer" containerID="5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.229024 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:07:52 crc kubenswrapper[4752]: E1124 11:07:52.229354 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.234269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.234304 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.234314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.234331 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.234348 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.262613 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.279137 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.296346 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.310737 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.321064 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 
11:07:52.331814 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.336432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.336465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.336479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.336495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.336507 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.341216 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.351834 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.362585 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.373758 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.393441 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5256aecef8078d063a5fa47611079029d475ec85a54504d52038cc0cf1b69978\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:22Z\\\",\\\"message\\\":\\\"w:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1124 11:07:22.609205 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:51Z\\\",\\\"message\\\":\\\"ice node crc\\\\nI1124 11:07:51.574495 6759 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 14.16µs\\\\nI1124 11:07:51.574517 6759 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:51.573217 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 11:07:51.574955 6759 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:51.575080 6759 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575120 6759 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575222 6759 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575549 6759 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:51.575577 6759 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 11:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\
\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.403596 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.416989 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.427421 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.436353 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.439408 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.439440 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.439452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.439469 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.439481 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.446622 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1d7e03f-e4e6-4fcc-b5c3-2959ec48c7a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.457500 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.468198 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.482984 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.542092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.542130 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.542140 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.542154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.542165 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.644890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.644950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.644966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.644988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.645004 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.747729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.747803 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.747815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.747830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.747841 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.849728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.849807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.849824 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.849845 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.849862 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.935576 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.935612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.935619 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.935636 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.935649 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: E1124 11:07:52.947180 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.951182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.951216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.951227 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.951241 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.951253 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: E1124 11:07:52.971087 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.975314 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.975360 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.975371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.975390 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.975403 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:52 crc kubenswrapper[4752]: E1124 11:07:52.990106 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:52Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.993790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.993831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.993842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.993859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:52 crc kubenswrapper[4752]: I1124 11:07:52.993871 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:52Z","lastTransitionTime":"2025-11-24T11:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.010241 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.014154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.014196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.014208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.014224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.014237 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.025414 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.025558 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.027152 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.027182 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.027192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.027207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.027218 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.130110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.130183 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.130207 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.130237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.130278 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.232938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.232965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.232976 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.232991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.233001 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.234112 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/3.log" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.237449 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.237663 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.248632 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.261752 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.272697 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.285242 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.304699 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:51Z\\\",\\\"message\\\":\\\"ice node crc\\\\nI1124 11:07:51.574495 6759 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 14.16µs\\\\nI1124 11:07:51.574517 6759 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:51.573217 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 11:07:51.574955 6759 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:51.575080 6759 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575120 6759 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575222 6759 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575549 6759 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:51.575577 6759 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 11:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.315999 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.327357 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.338602 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.338637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.338651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.338666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.338676 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.340729 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.354296 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.366836 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.379032 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1d7e03f-e4e6-4fcc-b5c3-2959ec48c7a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.393269 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.406920 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.419718 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.430063 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.443598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.443631 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.443640 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.443655 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.443663 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.453091 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.466410 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.479360 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.493292 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:53Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.546629 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.546699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.546718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.546769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.546786 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.649177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.649432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.649544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.649645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.649858 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.727383 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.727457 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.727772 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.727874 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.727938 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.728096 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.728147 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:53 crc kubenswrapper[4752]: E1124 11:07:53.728311 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.753039 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.753341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.753414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.753489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.753557 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.856340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.856659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.856828 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.856952 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.857064 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.959187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.959463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.959612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.959700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:53 crc kubenswrapper[4752]: I1124 11:07:53.959828 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:53Z","lastTransitionTime":"2025-11-24T11:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.062989 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.063027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.063040 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.063057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.063068 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.166530 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.166563 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.166573 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.166589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.166598 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.269658 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.269715 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.269728 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.269763 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.269776 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.372370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.372722 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.372814 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.372902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.372969 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.475931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.475964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.475973 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.475988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.475999 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.579594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.579628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.579638 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.579652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.579661 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.681886 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.681939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.681950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.681968 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.681982 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.741930 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.756464 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.770366 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.780018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1d7e03f-e4e6-4fcc-b5c3-2959ec48c7a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.783800 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.783854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.783868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.783887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.783898 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.792486 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.803818 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.814949 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.829196 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.850086 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.863256 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.875342 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887326 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887412 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887431 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887449 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.887638 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.897557 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.908240 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.918910 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.931251 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.946470 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:51Z\\\",\\\"message\\\":\\\"ice node crc\\\\nI1124 11:07:51.574495 6759 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 14.16µs\\\\nI1124 11:07:51.574517 6759 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:51.573217 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 11:07:51.574955 6759 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:51.575080 6759 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575120 6759 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575222 6759 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575549 6759 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:51.575577 6759 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 11:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.955220 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.969154 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:07:54Z is after 2025-08-24T17:21:41Z" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.990175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.990208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.990220 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.990237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:54 crc kubenswrapper[4752]: I1124 11:07:54.990252 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:54Z","lastTransitionTime":"2025-11-24T11:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.092972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.093011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.093020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.093034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.093070 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.196025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.196092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.196106 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.196124 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.196136 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.298276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.298315 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.298323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.298336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.298344 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.401259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.401336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.401352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.401374 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.401387 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.504781 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.504852 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.504868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.504892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.504906 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.608542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.608594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.608608 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.608626 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.608640 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.710465 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.710506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.710515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.710529 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.710540 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
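Has your network provider started?"}

Each KubeletNotReady heartbeat above repeats the same underlying condition: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI network configuration yet. A short Go sketch of such a presence check (the directory is the one named in the log; the glob patterns are illustrative assumptions, not the runtime's actual matching rules):

// cnicheck.go - editorial sketch: looks for CNI network configuration
// in the directory the log messages above point at.
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d"
	var found []string
	// Assumed config-file patterns; the real runtime uses its own
	// loader with stricter parsing of each candidate file.
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(confDir, pattern)) // literal patterns are always valid
		found = append(found, matches...)
	}
	if len(found) == 0 {
		fmt.Printf("no CNI configuration file in %s/. Has your network provider started?\n", confDir)
		return
	}
	fmt.Println("CNI configuration present:", found)
}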
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.727946 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.727975 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.728001 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.728045 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:55 crc kubenswrapper[4752]: E1124 11:07:55.728083 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:07:55 crc kubenswrapper[4752]: E1124 11:07:55.728213 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:07:55 crc kubenswrapper[4752]: E1124 11:07:55.728313 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:07:55 crc kubenswrapper[4752]: E1124 11:07:55.728363 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.813153 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.813190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.813218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.813231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.813240 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.915883 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.916058 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.916092 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.916175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:55 crc kubenswrapper[4752]: I1124 11:07:55.916208 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:55Z","lastTransitionTime":"2025-11-24T11:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.018536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.018586 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.018597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.018614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.018627 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.120643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.120677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.120686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.120700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.120708 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.224547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.224602 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.224614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.224643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.224655 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.327559 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.327617 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.327628 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.327647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.327660 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.430938 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.431071 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.431093 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.431121 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.431139 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.534167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.534646 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.534765 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.534861 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.534949 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.637721 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.637771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.637780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.637795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.637804 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.739985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.740035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.740048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.740066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.740078 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.842553 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.842633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.842647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.842668 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.842682 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
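Has your network provider started?"}

Every setters.go:603 line above serializes the node's Ready condition to JSON before recording it. A compact Go sketch that reproduces the shape of that condition payload (a local struct mirroring the JSON field names visible in the log, not the actual Kubernetes API types):

// nodecondition.go - editorial sketch: emits a Ready=False node
// condition with the same JSON field names as the log entries above.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"time"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	now := time.Now().UTC().Format(time.RFC3339)
	cond := NodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
	}
	out, err := json.Marshal(cond)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("\"Node became not ready\" node=\"crc\" condition=%s\n", out)
}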
Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.944929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.944988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.945006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.945055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:56 crc kubenswrapper[4752]: I1124 11:07:56.945074 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:56Z","lastTransitionTime":"2025-11-24T11:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.047917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.048341 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.048537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.048690 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.048889 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.152024 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.152368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.152449 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.152522 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.152664 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.254815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.254893 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.254936 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.254966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.254990 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.357991 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.358043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.358051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.358063 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.358073 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.462028 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.462125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.462185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.462216 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.462290 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.565174 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.565209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.565218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.565231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.565241 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.668652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.668687 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.668696 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.668711 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.668723 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.727853 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.727895 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:07:57 crc kubenswrapper[4752]: E1124 11:07:57.728365 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.728035 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.727896 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:07:57 crc kubenswrapper[4752]: E1124 11:07:57.728467 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:07:57 crc kubenswrapper[4752]: E1124 11:07:57.728368 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:07:57 crc kubenswrapper[4752]: E1124 11:07:57.728516 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.771427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.771479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.771495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.771514 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.771529 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.873961 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.874027 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.874049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.874076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.874098 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.977524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.977586 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.977607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.977643 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:57 crc kubenswrapper[4752]: I1124 11:07:57.977665 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:57Z","lastTransitionTime":"2025-11-24T11:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.080660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.080734 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.080766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.080788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.080801 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.184105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.184186 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.184206 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.184228 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.184251 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.286644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.287011 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.287022 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.287034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.287043 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.389649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.389712 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.389729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.389795 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.389816 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.492815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.492918 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.492942 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.492974 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.492998 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.520073 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.520310 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.52026302 +0000 UTC m=+148.505083339 (durationBeforeRetry 1m4s). 
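Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers

The teardown fails because the kubevirt.io.hostpath-provisioner CSI driver has not re-registered with the kubelet yet, and the "No retries permitted until ... (durationBeforeRetry 1m4s)" wording reflects per-volume exponential backoff: 1m4s is 64 seconds, consistent with a delay that doubles after each failed attempt. A Go sketch of such a doubling policy (base delay and cap are assumptions for illustration, not values taken from the kubelet source):

// volumebackoff.go - editorial sketch: models a retry delay that
// doubles per failure, the pattern behind "durationBeforeRetry 1m4s".
package main

import (
	"fmt"
	"time"
)

func main() {
	base := 1 * time.Second     // assumed initial delay
	maxDelay := 2 * time.Minute // assumed cap
	delay := base
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// The seventh consecutive failure waits 1m4s, matching the
	// durationBeforeRetry seen in the log.
}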
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.596337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.596401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.596413 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.596450 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.596464 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.622060 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.622133 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.622155 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.622180 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622291 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object
"openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622330 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622354 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622369 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622375 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622425 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.622388676 +0000 UTC m=+148.607209005 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622458 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622555 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622467 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.622441687 +0000 UTC m=+148.607262016 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622703 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.622642844 +0000 UTC m=+148.607463173 (durationBeforeRetry 1m4s). 
Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.622817 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 24 11:07:58 crc kubenswrapper[4752]: E1124 11:07:58.623050 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.622955913 +0000 UTC m=+148.607776242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.698806 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.698891 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.698910 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.698933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.698953 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
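The kube-api-access-* volumes failing above are projected volumes assembled from several sources: the ServiceAccount token, the kube-root-ca.crt ConfigMap and, on OpenShift, the openshift-service-ca.crt ConfigMap. projected.go collects one error per unresolvable source rather than stopping at the first, which is why each failure reports a two-element list. A sketch of that aggregation, assuming a plain map as a stand-in for the kubelet's object cache (the cache only serves objects "registered" for pods the kubelet has synced, hence the wording of the errors):

    package main

    import (
        "fmt"
        "strings"
    )

    // cache stands in for the kubelet's per-pod object cache; empty here, as
    // in the log, because the pods' sources are not yet registered.
    var cache = map[string]bool{}

    // setUpProjected collects one error per missing source instead of
    // stopping at the first, mirroring the aggregated message in the log.
    func setUpProjected(namespace string, sources []string) error {
        var errs []string
        for _, name := range sources {
            if !cache[namespace+"/"+name] {
                errs = append(errs, fmt.Sprintf("object %q/%q not registered", namespace, name))
            }
        }
        if len(errs) > 0 {
            return fmt.Errorf("[%s]", strings.Join(errs, ", "))
        }
        return nil
    }

    func main() {
        fmt.Println(setUpProjected("openshift-network-diagnostics",
            []string{"kube-root-ca.crt", "openshift-service-ca.crt"}))
    }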
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.802484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.802559 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.802583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.802611 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.802631 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.905835 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.905886 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.905902 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.905927 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:58 crc kubenswrapper[4752]: I1124 11:07:58.905944 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:58Z","lastTransitionTime":"2025-11-24T11:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.008460 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.008509 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.008520 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.008536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.008547 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
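The 1m4s durationBeforeRetry stamped on the MountVolume failures earlier in this burst is consistent with a retry delay that doubles from 500 ms on each consecutive failure (0.5 s x 2^7 = 64 s), which is why the next attempt is scheduled for 11:09:02 rather than retried immediately. A toy reproduction of that doubling; the initial delay and iteration count are assumptions for illustration, not values read from the kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed starting delay of 500ms, doubling after every failure.
        delay := 500 * time.Millisecond
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
            delay *= 2
        }
        // The 8th line prints 1m4s, matching the backoff in the log.
    }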
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.111196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.111242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.111253 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.111269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.111280 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.213320 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.213361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.213372 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.213389 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.213400 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.317430 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.317479 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.317499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.317525 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.317545 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.420181 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.420221 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.420233 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.420249 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.420259 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.522895 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.522943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.522955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.522972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.522985 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.625322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.625357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.625368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.625383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.625394 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
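The setters.go:603 entries repeating above differ only in their timestamps; the condition payload itself is a single Ready condition. Decoding one of the logged payloads shows there is no further structure to it (the struct fields simply follow the JSON keys in the log):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Condition mirrors the JSON printed by the repeated setters.go entries.
    type Condition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
        var c Condition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            panic(err)
        }
        fmt.Printf("%s=%s because %s\n", c.Type, c.Status, c.Reason)
    }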
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.726878 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.726896 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.726949 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:07:59 crc kubenswrapper[4752]: E1124 11:07:59.727008 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.727071 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:07:59 crc kubenswrapper[4752]: E1124 11:07:59.727172 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:07:59 crc kubenswrapper[4752]: E1124 11:07:59.727272 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:07:59 crc kubenswrapper[4752]: E1124 11:07:59.727353 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
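All four pods are blocked on the same root cause as the node's readiness: sandbox creation needs a CNI network, and /etc/kubernetes/cni/net.d/ is still empty because the cluster network plugin's pods have not started yet. CNI consumers discover a network by scanning that directory for .conf/.conflist files; a self-contained check in that spirit (a sketch of the discovery convention, not CRI-O's exact logic):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // Directory taken from the log messages above.
        dir := "/etc/kubernetes/cni/net.d"
        confs, _ := filepath.Glob(filepath.Join(dir, "*.conf"))
        lists, _ := filepath.Glob(filepath.Join(dir, "*.conflist"))
        if len(confs)+len(lists) == 0 {
            fmt.Fprintf(os.Stderr, "no CNI configuration file in %s/. Has your network provider started?\n", dir)
            os.Exit(1)
        }
        fmt.Println("CNI configurations found:", append(confs, lists...))
    }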
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.728649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.728675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.728683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.728697 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.728706 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.831840 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.831887 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.831907 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.831930 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.831946 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.934813 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.934854 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.934866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.934882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:07:59 crc kubenswrapper[4752]: I1124 11:07:59.934893 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:07:59Z","lastTransitionTime":"2025-11-24T11:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.038057 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.038098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.038108 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.038125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.038137 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.141076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.141114 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.141123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.141139 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.141149 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.244501 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.244556 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.244567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.244582 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.244593 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.346259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.346303 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.346313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.346327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.346337 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.449315 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.449362 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.449370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.449385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.449394 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.552572 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.552623 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.552635 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.552653 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.552665 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.656305 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.656368 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.656380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.656401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.656414 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.759484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.759554 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.759570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.759594 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.759612 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.862149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.862213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.862225 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.862240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.862251 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.964870 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.964919 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.964927 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.964941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:00 crc kubenswrapper[4752]: I1124 11:08:00.964949 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:00Z","lastTransitionTime":"2025-11-24T11:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.070518 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.070558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.070567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.070584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.070594 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.173237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.173273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.173281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.173296 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.173307 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.275512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.275566 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.275579 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.275596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.275607 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.378138 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.378180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.378191 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.378208 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.378219 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.485717 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.485785 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.485797 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.485815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.485830 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.588418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.588484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.588494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.588533 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.588544 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.691925 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.691988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.692004 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.692025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.692041 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.727443 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.727470 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.727509 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.727618 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:08:01 crc kubenswrapper[4752]: E1124 11:08:01.727666 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 11:08:01 crc kubenswrapper[4752]: E1124 11:08:01.727695 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:08:01 crc kubenswrapper[4752]: E1124 11:08:01.727817 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:08:01 crc kubenswrapper[4752]: E1124 11:08:01.727925 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.795148 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.795187 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.795196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.795209 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.795218 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.896933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.896966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.896980 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.897006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.897022 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.999101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.999136 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.999145 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.999175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:01 crc kubenswrapper[4752]: I1124 11:08:01.999184 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:01Z","lastTransitionTime":"2025-11-24T11:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.100890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.100950 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.100972 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.101001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.101022 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.203785 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.203830 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.203846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.203869 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.203886 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.307002 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.307046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.307056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.307070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.307079 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.409512 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.409559 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.409568 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.409583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.409594 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.512564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.512604 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.512614 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.512630 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.512641 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.616126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.616168 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.616178 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.616195 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.616205 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.718669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.718720 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.718736 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.718792 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.718808 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.821819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.821890 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.821901 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.821924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.821941 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.925385 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.925429 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.925439 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.925455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:02 crc kubenswrapper[4752]: I1124 11:08:02.925465 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.028021 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.028091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.028103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.028125 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.028138 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.131588 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.131650 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.131662 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.131682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.131691 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.234735 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.234842 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.234859 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.234888 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.234905 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.308129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.308205 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.308218 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.308245 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.308260 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
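The kubelet is looping here: roughly every 100 ms it records the same four node events (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and re-asserts Ready=False with reason KubeletNotReady, because nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/ yet. The condition printed by setters.go:603 is plain JSON, so it can be pulled apart mechanically; below is a minimal Go sketch that decodes one of these lines and reports the readiness state. The struct is derived from the log output itself, not from the kubelet's own types.

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

// readyCondition mirrors the condition={...} JSON printed at setters.go:603.
// Field names are read off the log lines above; illustrative only.
type readyCondition struct {
    Type               string    `json:"type"`
    Status             string    `json:"status"`
    LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
    LastTransitionTime time.Time `json:"lastTransitionTime"`
    Reason             string    `json:"reason"`
    Message            string    `json:"message"`
}

func main() {
    // Condition copied verbatim from the 11:08:02.616205 entry above.
    raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:02Z","lastTransitionTime":"2025-11-24T11:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

    var c readyCondition
    if err := json.Unmarshal([]byte(raw), &c); err != nil {
        panic(err)
    }
    fmt.Printf("Ready=%s reason=%s since=%s\n", c.Status, c.Reason, c.LastTransitionTime.Format(time.RFC3339))
}

A Ready condition with Status False is what shows the node as NotReady; the events above are just the kubelet recording that fact before it tries to push the status to the API server.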
Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.323968 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.328480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.328529 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
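The payload the kubelet keeps retrying above is a strategic merge patch against the Node object: the $setElementOrder/conditions directive fixes the ordering of the four conditions, and the body restates allocatable and capacity, the condition list, the node's image inventory, and nodeInfo. A stripped-down Go sketch of the same shape, using the allocatable and capacity values from the entry above and omitting the conditions and images arrays for brevity:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Skeleton of the kubelet's node-status strategic merge patch, with the
    // resource values from the 11:08:03.323968 attempt; conditions and the
    // images array are omitted here.
    patch := map[string]interface{}{
        "status": map[string]interface{}{
            "$setElementOrder/conditions": []map[string]string{
                {"type": "MemoryPressure"}, {"type": "DiskPressure"},
                {"type": "PIDPressure"}, {"type": "Ready"},
            },
            "allocatable": map[string]string{
                "cpu": "11800m", "ephemeral-storage": "76396645454", "memory": "32404556Ki",
            },
            "capacity": map[string]string{
                "cpu": "12", "ephemeral-storage": "83293888Ki", "memory": "32865356Ki",
            },
        },
    }
    b, err := json.MarshalIndent(patch, "", "  ")
    if err != nil {
        panic(err)
    }
    fmt.Println(string(b))
}

It is the full version of this payload, with its roughly fifty-entry images list, that makes each failed retry so noisy in the journal.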
event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.328539 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.328557 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.328568 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.352573 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.356993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.357046 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.357055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.357076 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.357088 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.373091 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:03Z is after 2025-08-24T17:21:41Z"
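Every one of these patch attempts fails the same way, and the failure has nothing to do with CNI: the API server must call the node.network-node-identity.openshift.io admission webhook before admitting the node-status patch, and the webhook's serving certificate at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, about three months before the node's current clock of 2025-11-24. A minimal Go sketch to confirm the certificate's validity window from the node; it deliberately skips chain verification, since verification is exactly the step that fails, and the endpoint is taken from the error text:

package main

import (
    "crypto/tls"
    "fmt"
    "time"
)

func main() {
    // Handshake with the webhook endpoint named in the kubelet error.
    // Verification is disabled so we can inspect the certificate that
    // would otherwise be rejected.
    conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
    if err != nil {
        panic(err)
    }
    defer conn.Close()

    cert := conn.ConnectionState().PeerCertificates[0]
    fmt.Printf("subject:   %s\n", cert.Subject)
    fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
    fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
    if now := time.Now(); now.After(cert.NotAfter) {
        // Same condition the x509 error above reports.
        fmt.Printf("expired: current time %s is after %s\n", now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
    }
}

On CRC this pattern typically shows up after the VM has been stopped or suspended past the certificates' lifetime; the errors persist until the cluster manages to rotate its certificates again.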
Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.378149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.378203 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.378217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.378237 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.378252 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.393535 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:03Z is after 2025-08-24T17:21:41Z"
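The NetworkReady=false half of the loop is simpler: the container runtime reports that /etc/kubernetes/cni/net.d/ contains no CNI configuration file, and the network plugin that would write one has not come up yet. The check the message describes amounts to scanning that directory for a usable config; a rough Go equivalent is below. The extension list mirrors what common CNI config loaders accept and is an assumption here, not the kubelet's actual code:

package main

import (
    "fmt"
    "os"
    "path/filepath"
    "strings"
)

func main() {
    // Directory named in the kubelet error message.
    dir := "/etc/kubernetes/cni/net.d"

    entries, err := os.ReadDir(dir)
    if err != nil {
        fmt.Printf("NetworkReady=false: %v\n", err)
        return
    }
    var confs []string
    for _, e := range entries {
        // .conf, .conflist and .json are the usual CNI config extensions
        // (assumed here; not taken from kubelet source).
        switch strings.ToLower(filepath.Ext(e.Name())) {
        case ".conf", ".conflist", ".json":
            confs = append(confs, e.Name())
        }
    }
    if len(confs) == 0 {
        fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
        return
    }
    fmt.Println("NetworkReady=true, configs:", confs)
}

Once a network provider writes a config there, the runtime should report NetworkReady=true on its next sync and the KubeletNotReady condition clears.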
Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.397123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.397154 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.397162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.397177 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.397186 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.410425 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T11:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"47425241-83f5-42c8-9f71-0c166d7ef9e2\\\",\\\"systemUUID\\\":\\\"366115a7-2c9a-450b-9862-da5d0db853ac\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:03Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.410580 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.412502 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.412691 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.412843 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.413000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.413092 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.516093 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.516162 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.516175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.516197 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.516214 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.619446 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.619502 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.619519 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.619543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.619559 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.721706 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.721739 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.721766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.721780 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.721791 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.727941 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.727982 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.728069 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.728073 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.728108 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.728180 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.728248 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:03 crc kubenswrapper[4752]: E1124 11:08:03.728301 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.825357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.825400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.825409 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.825423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.825433 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.928098 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.928169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.928189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.928215 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:03 crc kubenswrapper[4752]: I1124 11:08:03.928232 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:03Z","lastTransitionTime":"2025-11-24T11:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.030340 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.030393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.030404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.030420 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.030431 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.133841 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.133880 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.133889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.133903 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.133912 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.236601 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.236641 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.236652 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.236670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.236682 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.339193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.339243 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.339254 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.339271 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.339282 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.441928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.441977 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.441988 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.442006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.442019 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.544337 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.544411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.544426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.544450 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.544461 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.647621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.647672 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.647684 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.647701 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.647865 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.739820 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d919d0f-1c4b-493d-8e80-7927e899e908\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3148c48d5d24649328dc14aa236f7f2b91c042388ca5d3a144213a5c7378260f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cad758e1529fcd67c61a844094a6e21583cea12332ab55d8f34e1fbb6f3d1524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6fscc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z64nn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750149 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750160 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750176 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750185 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.750524 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6dea074-570b-440e-b555-46c1dde88efa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jscg9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:07:09Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8gb7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.760998 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d681098-9a27-4c4a-abf7-c110797b560a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6397ea9183eb8496cf44e704592093696e2259e829ac9e15909991c2a6c8d4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5972f2c703132e0b86e8f9760d131c0818594b0b907c71ad690b7e9f195fdded\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://432d6b220ac0f5a3af15e9cd8e4ca5b1226bd50506ad71a11ec6e7b6b4dac614\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed6c68da11e0ec6d330ace984e0819e2051f7f37eebf0455b91201c8589b1a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.770701 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddeea6f3-0f59-4651-b29c-895c5980d711\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9bd83a541e9bd2db206f6620364e5f41cab8fd696ba54e4df5d1e2b58028641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf4d09b316208ce9cf70891db8aaedc1a10f2d347e3a901d11fee9f9da43a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe4128b62953bbd3725ddc1ecf6a9c1c6fe5282ac7422e8c2ac007cda444ace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7631b7f595858c73834bf11a808301514a3b625052b201189819d3421e6e55bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.781231 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jh899" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f578963d-5ff1-4e31-945b-cc59f0b244bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:43Z\\\",\\\"message\\\":\\\"2025-11-24T11:06:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e\\\\n2025-11-24T11:06:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8569c526-7330-4242-a385-ab82dd32903e to /host/opt/cni/bin/\\\\n2025-11-24T11:06:58Z [verbose] multus-daemon started\\\\n2025-11-24T11:06:58Z [verbose] Readiness Indicator file check\\\\n2025-11-24T11:07:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rcps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jh899\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.798834 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T11:07:51Z\\\",\\\"message\\\":\\\"ice node crc\\\\nI1124 11:07:51.574495 6759 egressservice_zone_node.go:113] Finished syncing Egress Service node crc: 14.16µs\\\\nI1124 11:07:51.574517 6759 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1124 11:07:51.573217 6759 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 11:07:51.574955 6759 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 11:07:51.575080 6759 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575120 6759 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575222 6759 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 11:07:51.575549 6759 ovnkube.go:599] Stopped ovnkube\\\\nI1124 11:07:51.575577 6759 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 11:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6gwnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bkksr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.809075 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jktjk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270867eb-4cb0-47a6-958a-f411367a85b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ade38f0d51834521068693ef4008ed9760b9d54a8a8cfc4be5f687bb266dc4e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pgkv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jktjk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.818852 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afccdbdac1eff9b564c64997649d9c34db6052cced3d053b1b5faf51ab14a12f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.833556 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd34b2d0c9f1731ce1dc24c078a17361fc33d962037fc48ee21ecbcf4986fd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156cd45b83c97666189a0081b12acece2f273b3a03bc135ec0729c81eeb5170b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.844136 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-brns2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04afe719-08a4-4d22-83d2-6dd16d191cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0789d2dfd0d83f253e77f99448478eb42e5e45f12e9621ea16bc89e97492010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzkhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-brns2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.852807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.852839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.852849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.852864 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.852876 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.855447 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1d7e03f-e4e6-4fcc-b5c3-2959ec48c7a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfcc00186dc5f4713c23dff4b721d4b3e73262c7e7023a031afd6ad4148efb97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3310005b0f0e931a397e87b2649c72ef96467f1f7439267f1e410109916e4034\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 
crc kubenswrapper[4752]: I1124 11:08:04.867993 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b427613-8be1-45a9-8e97-45b0fd75ff4a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d3d3279037cda399bef36619c150353e6948dd375c8fa9db5fef6bdca4c44ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4791d1130ba00230bee650a69082d11b331e4415d83301548ec1f2550b59958a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac0836e28edcd407034cbc884bd1aa77b7c75f500b20c088cb2e53052ba1af52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c270d60b2ca9fab983848fd8ee8c44a1e60544363b6c1408aba2e70c4d29534e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c59c57ce45fcab41c773b0b12ac04a5c93474088c3a43e42add9cf9b246b66b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T11:06:48Z\\\",\\\"message\\\":\\\"W1124 11:06:37.846162 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1124 11:06:37.846539 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763982397 cert, and key in /tmp/serving-cert-3641171055/serving-signer.crt, /tmp/serving-cert-3641171055/serving-signer.key\\\\nI1124 11:06:38.225612 1 observer_polling.go:159] Starting file observer\\\\nW1124 11:06:38.228082 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1124 11:06:38.228286 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1124 11:06:38.229260 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3641171055/tls.crt::/tmp/serving-cert-3641171055/tls.key\\\\\\\"\\\\nF1124 11:06:48.685179 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24ec6d0c0c354883fbcb0141fb133d37861353398ad16742be6d2564a4c85c71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5987240b879fd75346e6204a9adc24367e83ad84ca031ccfb201de808a9a8f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.879658 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.890050 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.899009 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f890fc2e-8d6c-4109-882a-9e90340097a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0197a911066059e8637bae1251b551c637ed350ba2fcd92b4dcace745a28c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hsrgx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vhwb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.917070 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db7bddff-87d8-49ce-91b0-a9598e04fa97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2325c8c4a059eb206089a1e4c78f02cb313c544aabc8e368cb06f8f5e3725354\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8877ec3aba0de1bc9e496bba5d4e699bf7d96515c1596c6a176e3a9267f76402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0a06e6bc374614ab27d7d45d36efc3cadfdba44e33705d62d85ed50e38fad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f900f515002fbeba5eecdf005ccfda9ff78f6ed1b990af5498b4d2f101e0bea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0768511ec90890b5393b32427647df3c55b88445ac9e00cf2d516a92c47c9bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f37718955a5a8058c86a2c0ab065d76799ffad1cd64e3e29e3ca3c2553aeb9f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17f201d7ad69f1691a5b491e56f00f34241ae05bf1fd5ac5b7fde09bd5c1348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff4354cb142aa5681e41d77ed00358a98a82ab81562f0b9c7e1cb3e5b7d5cf7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.930201 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bec4d7a924c5edd7f9268c8f39e92e2d281f41f1002216099e9307129538f4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.943139 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956331 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956381 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956409 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:04Z","lastTransitionTime":"2025-11-24T11:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:04 crc kubenswrapper[4752]: I1124 11:08:04.956623 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"136bf646-d691-4d52-b178-1f94d2d19458\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T11:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a75772043abc2503950b09cf6a83fcd6c12bc5c3f68a1a4ae51c968d80c1cc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T11:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e54d9849965128d8f10c8fbca7e5f45f8dce3aed7f3b57090f209fb330ebba2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://667ab0b868d0f17ffe279b462cf7fee80942f9b75610110b68e868c5b7713fa8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06ad553bf80b5ed36282804d9a45ec146eb6f0622621c3530678956b15142927\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:06:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://463c58b275f977be867566dff8a40258a33dc12aec887b333421e96172d074ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ae8a0268c7c160a82a3ebf9df53c1b32f766e4d080a8b224585783b6efa3e41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81340cd822084803240d4c4c0d77abbc1deee4c3e23853123d1e3b3e422c652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T11:07:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T11:07:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b8gvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T11:06:56Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2b5jq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T11:08:04Z is after 2025-08-24T17:21:41Z" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.058372 4752 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.058409 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.058417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.058432 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.058442 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.161414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.161484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.161511 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.161546 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.161572 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.263685 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.263732 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.263778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.263800 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.263813 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.366265 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.366321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.366335 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.366352 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.366365 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.468291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.468336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.468347 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.468365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.468375 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.571036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.571090 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.571107 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.571128 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.571175 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.674217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.674313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.674325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.674346 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.674360 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.727142 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.727193 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.727161 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.727142 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:05 crc kubenswrapper[4752]: E1124 11:08:05.727344 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:05 crc kubenswrapper[4752]: E1124 11:08:05.727539 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:05 crc kubenswrapper[4752]: E1124 11:08:05.727718 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:05 crc kubenswrapper[4752]: E1124 11:08:05.727852 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.777784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.777839 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.777853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.777875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.777891 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.881289 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.881380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.881392 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.881417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.881434 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.985404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.985456 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.985468 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.985488 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:05 crc kubenswrapper[4752]: I1124 11:08:05.985502 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:05Z","lastTransitionTime":"2025-11-24T11:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.088956 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.089040 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.089067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.089100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.089124 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.192214 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.192324 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.192342 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.192424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.192445 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.295452 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.295508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.295520 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.295541 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.295559 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.398617 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.398651 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.398659 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.398672 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.398681 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.506184 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.506268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.506294 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.506327 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.506351 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.610480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.610518 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.610528 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.610546 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.610558 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.713494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.713548 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.713562 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.713589 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.713604 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.817034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.817080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.817091 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.817112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.817126 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.920816 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.920865 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.920874 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.920894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:06 crc kubenswrapper[4752]: I1124 11:08:06.920905 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:06Z","lastTransitionTime":"2025-11-24T11:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.024256 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.025026 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.025075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.025105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.025120 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.128283 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.128321 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.128330 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.128348 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.128357 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.230587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.230666 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.230689 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.230718 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.230771 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.333654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.333709 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.333724 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.333768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.333785 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.436542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.436575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.436584 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.436600 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.436611 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.539394 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.539442 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.539454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.539471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.539484 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.642132 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.642169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.642178 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.642192 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.642202 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.728037 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.728091 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.728122 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.728163 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:07 crc kubenswrapper[4752]: E1124 11:08:07.728174 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:07 crc kubenswrapper[4752]: E1124 11:08:07.728470 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:07 crc kubenswrapper[4752]: E1124 11:08:07.728808 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:07 crc kubenswrapper[4752]: E1124 11:08:07.728850 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.728953 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:08:07 crc kubenswrapper[4752]: E1124 11:08:07.729176 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.746035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.746072 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.746081 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.746095 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.746106 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.849211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.849259 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.849276 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.849302 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.849319 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.952421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.952484 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.952506 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.952535 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:07 crc kubenswrapper[4752]: I1124 11:08:07.952557 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:07Z","lastTransitionTime":"2025-11-24T11:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.055250 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.055323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.055357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.055396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.055419 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:08Z","lastTransitionTime":"2025-11-24T11:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.158549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.159000 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.159018 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.159043 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.159063 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:08Z","lastTransitionTime":"2025-11-24T11:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.261928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.262010 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.262036 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.262067 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.262098 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:08Z","lastTransitionTime":"2025-11-24T11:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.365008 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.365064 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.365080 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.365105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.365118 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:08Z","lastTransitionTime":"2025-11-24T11:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.468612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.468682 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.468699 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.468725 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 11:08:08 crc kubenswrapper[4752]: I1124 11:08:08.468793 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T11:08:08Z","lastTransitionTime":"2025-11-24T11:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[this five-entry node-status block repeats at roughly 100 ms intervals, with only the timestamps changing, from 11:08:08.571 through 11:08:09.711]
Nov 24 11:08:09 crc kubenswrapper[4752]: I1124 11:08:09.727425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:08:09 crc kubenswrapper[4752]: I1124 11:08:09.727470 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:08:09 crc kubenswrapper[4752]: I1124 11:08:09.727595 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:08:09 crc kubenswrapper[4752]: E1124 11:08:09.727820 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:08:09 crc kubenswrapper[4752]: E1124 11:08:09.727952 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 11:08:09 crc kubenswrapper[4752]: E1124 11:08:09.728080 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:08:09 crc kubenswrapper[4752]: I1124 11:08:09.727996 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:08:09 crc kubenswrapper[4752]: E1124 11:08:09.728232 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[the node-status block repeats at 11:08:09.814 and 11:08:09.917]
[the node-status block repeats at roughly 100 ms intervals from 11:08:10.023 through 11:08:11.671]
Nov 24 11:08:11 crc kubenswrapper[4752]: I1124 11:08:11.727734 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:08:11 crc kubenswrapper[4752]: I1124 11:08:11.727862 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:08:11 crc kubenswrapper[4752]: I1124 11:08:11.727905 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x"
Nov 24 11:08:11 crc kubenswrapper[4752]: I1124 11:08:11.727871 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:08:11 crc kubenswrapper[4752]: E1124 11:08:11.728099 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 11:08:11 crc kubenswrapper[4752]: E1124 11:08:11.728226 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 11:08:11 crc kubenswrapper[4752]: E1124 11:08:11.728386 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa"
Nov 24 11:08:11 crc kubenswrapper[4752]: E1124 11:08:11.728485 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[the node-status block repeats at 11:08:11.777]
[the node-status block repeats at roughly 100 ms intervals from 11:08:11.880 through 11:08:13.618]
Has your network provider started?"} Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.689469 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.689730 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.689876 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs podName:d6dea074-570b-440e-b555-46c1dde88efa nodeName:}" failed. No retries permitted until 2025-11-24 11:09:17.689846738 +0000 UTC m=+163.674667027 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs") pod "network-metrics-daemon-8gb7x" (UID: "d6dea074-570b-440e-b555-46c1dde88efa") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.703878 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk"] Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.708916 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.713241 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.713248 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.713303 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.713679 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.728066 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.728193 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.728254 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.728300 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.728401 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.728475 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.728560 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:13 crc kubenswrapper[4752]: E1124 11:08:13.728621 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.759409 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-brns2" podStartSLOduration=78.759389983 podStartE2EDuration="1m18.759389983s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.747488204 +0000 UTC m=+99.732308493" watchObservedRunningTime="2025-11-24 11:08:13.759389983 +0000 UTC m=+99.744210262" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.773181 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.773161147 podStartE2EDuration="1m19.773161147s" podCreationTimestamp="2025-11-24 11:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.773066605 +0000 UTC m=+99.757886894" watchObservedRunningTime="2025-11-24 11:08:13.773161147 +0000 UTC m=+99.757981436" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.789997 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb43be-ad58-448e-8f2d-db06d7de9826-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.790072 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.790106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43cb43be-ad58-448e-8f2d-db06d7de9826-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.790138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.790166 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43cb43be-ad58-448e-8f2d-db06d7de9826-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.822628 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podStartSLOduration=78.822607157 podStartE2EDuration="1m18.822607157s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.809316546 +0000 UTC m=+99.794136845" watchObservedRunningTime="2025-11-24 11:08:13.822607157 +0000 UTC m=+99.807427456" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.822954 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.822947687 podStartE2EDuration="23.822947687s" podCreationTimestamp="2025-11-24 11:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.819247826 +0000 UTC m=+99.804068125" watchObservedRunningTime="2025-11-24 11:08:13.822947687 +0000 UTC m=+99.807767986" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.875811 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2b5jq" podStartSLOduration=78.875792399 podStartE2EDuration="1m18.875792399s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.875106358 +0000 UTC m=+99.859926657" watchObservedRunningTime="2025-11-24 11:08:13.875792399 +0000 UTC m=+99.860612688" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891302 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/43cb43be-ad58-448e-8f2d-db06d7de9826-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891350 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891376 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43cb43be-ad58-448e-8f2d-db06d7de9826-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891396 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb43be-ad58-448e-8f2d-db06d7de9826-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891436 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891493 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.891504 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/43cb43be-ad58-448e-8f2d-db06d7de9826-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.892288 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43cb43be-ad58-448e-8f2d-db06d7de9826-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.898804 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43cb43be-ad58-448e-8f2d-db06d7de9826-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: 
\"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.903675 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.903655858 podStartE2EDuration="1m18.903655858s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.90040545 +0000 UTC m=+99.885225739" watchObservedRunningTime="2025-11-24 11:08:13.903655858 +0000 UTC m=+99.888476157" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.908414 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43cb43be-ad58-448e-8f2d-db06d7de9826-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz6mk\" (UID: \"43cb43be-ad58-448e-8f2d-db06d7de9826\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.914636 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.914618848 podStartE2EDuration="44.914618848s" podCreationTimestamp="2025-11-24 11:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.914431462 +0000 UTC m=+99.899251761" watchObservedRunningTime="2025-11-24 11:08:13.914618848 +0000 UTC m=+99.899439137" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.930186 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jh899" podStartSLOduration=78.930167666 podStartE2EDuration="1m18.930167666s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.930104564 +0000 UTC m=+99.914924863" watchObservedRunningTime="2025-11-24 11:08:13.930167666 +0000 UTC m=+99.914987955" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.964840 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jktjk" podStartSLOduration=78.96481399 podStartE2EDuration="1m18.96481399s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.964427608 +0000 UTC m=+99.949247907" watchObservedRunningTime="2025-11-24 11:08:13.96481399 +0000 UTC m=+99.949634309" Nov 24 11:08:13 crc kubenswrapper[4752]: I1124 11:08:13.990783 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z64nn" podStartSLOduration=77.99073556 podStartE2EDuration="1m17.99073556s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:13.978238974 +0000 UTC m=+99.963059313" watchObservedRunningTime="2025-11-24 11:08:13.99073556 +0000 UTC m=+99.975555869" Nov 24 11:08:14 crc kubenswrapper[4752]: I1124 11:08:14.025405 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" Nov 24 11:08:14 crc kubenswrapper[4752]: I1124 11:08:14.302384 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" event={"ID":"43cb43be-ad58-448e-8f2d-db06d7de9826","Type":"ContainerStarted","Data":"5c42d4bc24c04689bd3e2a9e9adbd25b8d618eacceb3f57765882aa2fb25ba7d"} Nov 24 11:08:14 crc kubenswrapper[4752]: I1124 11:08:14.302469 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" event={"ID":"43cb43be-ad58-448e-8f2d-db06d7de9826","Type":"ContainerStarted","Data":"3f273e29ede7b0a2c6ad7f5f3819c97c83f81915522a042dbef7cba2cb5fc7f0"} Nov 24 11:08:14 crc kubenswrapper[4752]: I1124 11:08:14.320633 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=80.320549324 podStartE2EDuration="1m20.320549324s" podCreationTimestamp="2025-11-24 11:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:14.025598661 +0000 UTC m=+100.010418970" watchObservedRunningTime="2025-11-24 11:08:14.320549324 +0000 UTC m=+100.305369613" Nov 24 11:08:14 crc kubenswrapper[4752]: I1124 11:08:14.321660 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz6mk" podStartSLOduration=79.321654137 podStartE2EDuration="1m19.321654137s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:14.320447821 +0000 UTC m=+100.305268160" watchObservedRunningTime="2025-11-24 11:08:14.321654137 +0000 UTC m=+100.306474426" Nov 24 11:08:15 crc kubenswrapper[4752]: I1124 11:08:15.727048 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:15 crc kubenswrapper[4752]: I1124 11:08:15.727127 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:15 crc kubenswrapper[4752]: E1124 11:08:15.728177 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:15 crc kubenswrapper[4752]: I1124 11:08:15.727175 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:15 crc kubenswrapper[4752]: E1124 11:08:15.728243 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:15 crc kubenswrapper[4752]: I1124 11:08:15.727179 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:15 crc kubenswrapper[4752]: E1124 11:08:15.728292 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:15 crc kubenswrapper[4752]: E1124 11:08:15.728096 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:17 crc kubenswrapper[4752]: I1124 11:08:17.726911 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:17 crc kubenswrapper[4752]: I1124 11:08:17.726931 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:17 crc kubenswrapper[4752]: I1124 11:08:17.726946 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:17 crc kubenswrapper[4752]: E1124 11:08:17.727451 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:17 crc kubenswrapper[4752]: E1124 11:08:17.727311 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:17 crc kubenswrapper[4752]: I1124 11:08:17.726956 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:17 crc kubenswrapper[4752]: E1124 11:08:17.727501 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:17 crc kubenswrapper[4752]: E1124 11:08:17.727575 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:19 crc kubenswrapper[4752]: I1124 11:08:19.727047 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:19 crc kubenswrapper[4752]: I1124 11:08:19.727059 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:19 crc kubenswrapper[4752]: I1124 11:08:19.727087 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:19 crc kubenswrapper[4752]: I1124 11:08:19.727300 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:19 crc kubenswrapper[4752]: E1124 11:08:19.727399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:19 crc kubenswrapper[4752]: E1124 11:08:19.727620 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:19 crc kubenswrapper[4752]: E1124 11:08:19.727768 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:19 crc kubenswrapper[4752]: E1124 11:08:19.727928 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:20 crc kubenswrapper[4752]: I1124 11:08:20.728546 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:08:20 crc kubenswrapper[4752]: E1124 11:08:20.728879 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bkksr_openshift-ovn-kubernetes(fa360dfd-2d4c-4442-84c9-af5d97c4c1fa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" Nov 24 11:08:21 crc kubenswrapper[4752]: I1124 11:08:21.727772 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:21 crc kubenswrapper[4752]: E1124 11:08:21.727963 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:21 crc kubenswrapper[4752]: I1124 11:08:21.728259 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:21 crc kubenswrapper[4752]: E1124 11:08:21.728351 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:21 crc kubenswrapper[4752]: I1124 11:08:21.728555 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:21 crc kubenswrapper[4752]: E1124 11:08:21.728643 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:21 crc kubenswrapper[4752]: I1124 11:08:21.728887 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:21 crc kubenswrapper[4752]: E1124 11:08:21.728981 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:23 crc kubenswrapper[4752]: I1124 11:08:23.727875 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:23 crc kubenswrapper[4752]: I1124 11:08:23.727985 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:23 crc kubenswrapper[4752]: I1124 11:08:23.728007 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:23 crc kubenswrapper[4752]: I1124 11:08:23.727888 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:23 crc kubenswrapper[4752]: E1124 11:08:23.728102 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:23 crc kubenswrapper[4752]: E1124 11:08:23.728202 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:23 crc kubenswrapper[4752]: E1124 11:08:23.728247 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:23 crc kubenswrapper[4752]: E1124 11:08:23.728306 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:25 crc kubenswrapper[4752]: I1124 11:08:25.727867 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:25 crc kubenswrapper[4752]: I1124 11:08:25.727914 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:25 crc kubenswrapper[4752]: E1124 11:08:25.728031 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:25 crc kubenswrapper[4752]: I1124 11:08:25.728105 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:25 crc kubenswrapper[4752]: I1124 11:08:25.728140 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:25 crc kubenswrapper[4752]: E1124 11:08:25.728231 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:25 crc kubenswrapper[4752]: E1124 11:08:25.728332 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:25 crc kubenswrapper[4752]: E1124 11:08:25.728449 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:27 crc kubenswrapper[4752]: I1124 11:08:27.727182 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:27 crc kubenswrapper[4752]: I1124 11:08:27.727201 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:27 crc kubenswrapper[4752]: I1124 11:08:27.727207 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:27 crc kubenswrapper[4752]: E1124 11:08:27.727438 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:27 crc kubenswrapper[4752]: I1124 11:08:27.727069 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:27 crc kubenswrapper[4752]: E1124 11:08:27.727863 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:27 crc kubenswrapper[4752]: E1124 11:08:27.728033 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:27 crc kubenswrapper[4752]: E1124 11:08:27.728142 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:29 crc kubenswrapper[4752]: I1124 11:08:29.727904 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:29 crc kubenswrapper[4752]: I1124 11:08:29.728080 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:29 crc kubenswrapper[4752]: E1124 11:08:29.728196 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:29 crc kubenswrapper[4752]: E1124 11:08:29.728439 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:29 crc kubenswrapper[4752]: I1124 11:08:29.728899 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:29 crc kubenswrapper[4752]: E1124 11:08:29.729087 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:29 crc kubenswrapper[4752]: I1124 11:08:29.729164 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:29 crc kubenswrapper[4752]: E1124 11:08:29.730151 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.354108 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/1.log" Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.354463 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/0.log" Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.354508 4752 generic.go:334] "Generic (PLEG): container finished" podID="f578963d-5ff1-4e31-945b-cc59f0b244bf" containerID="f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9" exitCode=1 Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.354536 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerDied","Data":"f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9"} Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.354569 4752 scope.go:117] "RemoveContainer" containerID="8397670f159f43e20cc8c6710ec3dca2af99b4defccbe1a2401fdf3e116fa8be" Nov 24 11:08:30 crc kubenswrapper[4752]: I1124 11:08:30.355176 4752 scope.go:117] "RemoveContainer" containerID="f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9" Nov 24 11:08:30 crc kubenswrapper[4752]: E1124 11:08:30.355847 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jh899_openshift-multus(f578963d-5ff1-4e31-945b-cc59f0b244bf)\"" pod="openshift-multus/multus-jh899" podUID="f578963d-5ff1-4e31-945b-cc59f0b244bf" Nov 24 11:08:31 crc kubenswrapper[4752]: I1124 11:08:31.358661 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/1.log" Nov 24 11:08:31 crc kubenswrapper[4752]: I1124 11:08:31.727087 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:31 crc kubenswrapper[4752]: I1124 11:08:31.727179 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:31 crc kubenswrapper[4752]: I1124 11:08:31.727214 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:31 crc kubenswrapper[4752]: I1124 11:08:31.727249 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:31 crc kubenswrapper[4752]: E1124 11:08:31.727223 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:31 crc kubenswrapper[4752]: E1124 11:08:31.727397 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:31 crc kubenswrapper[4752]: E1124 11:08:31.727434 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:31 crc kubenswrapper[4752]: E1124 11:08:31.727515 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:33 crc kubenswrapper[4752]: I1124 11:08:33.727610 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:33 crc kubenswrapper[4752]: I1124 11:08:33.727789 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:33 crc kubenswrapper[4752]: I1124 11:08:33.727849 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:33 crc kubenswrapper[4752]: E1124 11:08:33.727883 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:33 crc kubenswrapper[4752]: E1124 11:08:33.728043 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:33 crc kubenswrapper[4752]: I1124 11:08:33.728091 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:33 crc kubenswrapper[4752]: E1124 11:08:33.728249 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:33 crc kubenswrapper[4752]: E1124 11:08:33.728401 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:34 crc kubenswrapper[4752]: E1124 11:08:34.679377 4752 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 24 11:08:34 crc kubenswrapper[4752]: E1124 11:08:34.826919 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 11:08:35 crc kubenswrapper[4752]: I1124 11:08:35.728067 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:35 crc kubenswrapper[4752]: I1124 11:08:35.728100 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:35 crc kubenswrapper[4752]: I1124 11:08:35.728219 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:35 crc kubenswrapper[4752]: E1124 11:08:35.728359 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:35 crc kubenswrapper[4752]: I1124 11:08:35.728393 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:35 crc kubenswrapper[4752]: E1124 11:08:35.728461 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:35 crc kubenswrapper[4752]: E1124 11:08:35.728498 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:35 crc kubenswrapper[4752]: E1124 11:08:35.728910 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:35 crc kubenswrapper[4752]: I1124 11:08:35.729344 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.376989 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/3.log" Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.379781 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerStarted","Data":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.380209 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.403992 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podStartSLOduration=101.403975001 podStartE2EDuration="1m41.403975001s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:36.403523347 +0000 UTC m=+122.388343636" watchObservedRunningTime="2025-11-24 11:08:36.403975001 +0000 UTC m=+122.388795290" Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.609676 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8gb7x"] Nov 24 11:08:36 crc kubenswrapper[4752]: I1124 11:08:36.609891 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:36 crc kubenswrapper[4752]: E1124 11:08:36.610094 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:37 crc kubenswrapper[4752]: I1124 11:08:37.727038 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:37 crc kubenswrapper[4752]: I1124 11:08:37.728425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:37 crc kubenswrapper[4752]: E1124 11:08:37.728520 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:37 crc kubenswrapper[4752]: I1124 11:08:37.728709 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:37 crc kubenswrapper[4752]: E1124 11:08:37.728812 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:37 crc kubenswrapper[4752]: E1124 11:08:37.728966 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:38 crc kubenswrapper[4752]: I1124 11:08:38.727517 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:38 crc kubenswrapper[4752]: E1124 11:08:38.727723 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:39 crc kubenswrapper[4752]: I1124 11:08:39.727789 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:39 crc kubenswrapper[4752]: I1124 11:08:39.727788 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:39 crc kubenswrapper[4752]: E1124 11:08:39.728426 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:39 crc kubenswrapper[4752]: E1124 11:08:39.728498 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:39 crc kubenswrapper[4752]: I1124 11:08:39.728957 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:39 crc kubenswrapper[4752]: E1124 11:08:39.729207 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:39 crc kubenswrapper[4752]: E1124 11:08:39.828638 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 11:08:40 crc kubenswrapper[4752]: I1124 11:08:40.727899 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:40 crc kubenswrapper[4752]: E1124 11:08:40.728123 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:41 crc kubenswrapper[4752]: I1124 11:08:41.726994 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:41 crc kubenswrapper[4752]: E1124 11:08:41.727135 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:41 crc kubenswrapper[4752]: I1124 11:08:41.727330 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:41 crc kubenswrapper[4752]: E1124 11:08:41.727389 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:41 crc kubenswrapper[4752]: I1124 11:08:41.727504 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:41 crc kubenswrapper[4752]: E1124 11:08:41.727561 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:42 crc kubenswrapper[4752]: I1124 11:08:42.727559 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:42 crc kubenswrapper[4752]: E1124 11:08:42.727802 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:43 crc kubenswrapper[4752]: I1124 11:08:43.727233 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:43 crc kubenswrapper[4752]: I1124 11:08:43.727884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:43 crc kubenswrapper[4752]: I1124 11:08:43.728072 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:43 crc kubenswrapper[4752]: E1124 11:08:43.728142 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:43 crc kubenswrapper[4752]: I1124 11:08:43.728298 4752 scope.go:117] "RemoveContainer" containerID="f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9" Nov 24 11:08:43 crc kubenswrapper[4752]: E1124 11:08:43.729331 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:43 crc kubenswrapper[4752]: E1124 11:08:43.729418 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:44 crc kubenswrapper[4752]: I1124 11:08:44.411620 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/1.log" Nov 24 11:08:44 crc kubenswrapper[4752]: I1124 11:08:44.411681 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerStarted","Data":"657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0"} Nov 24 11:08:44 crc kubenswrapper[4752]: I1124 11:08:44.727131 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:44 crc kubenswrapper[4752]: E1124 11:08:44.728769 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:44 crc kubenswrapper[4752]: E1124 11:08:44.829422 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 11:08:45 crc kubenswrapper[4752]: I1124 11:08:45.727015 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:45 crc kubenswrapper[4752]: I1124 11:08:45.727134 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:45 crc kubenswrapper[4752]: I1124 11:08:45.727428 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:45 crc kubenswrapper[4752]: E1124 11:08:45.727640 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:45 crc kubenswrapper[4752]: E1124 11:08:45.727842 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:45 crc kubenswrapper[4752]: E1124 11:08:45.727998 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:46 crc kubenswrapper[4752]: I1124 11:08:46.727594 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:46 crc kubenswrapper[4752]: E1124 11:08:46.729701 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:47 crc kubenswrapper[4752]: I1124 11:08:47.727522 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:47 crc kubenswrapper[4752]: I1124 11:08:47.727627 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:47 crc kubenswrapper[4752]: E1124 11:08:47.727708 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:47 crc kubenswrapper[4752]: I1124 11:08:47.727782 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:47 crc kubenswrapper[4752]: E1124 11:08:47.727886 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:47 crc kubenswrapper[4752]: E1124 11:08:47.727935 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:48 crc kubenswrapper[4752]: I1124 11:08:48.727289 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:48 crc kubenswrapper[4752]: E1124 11:08:48.727455 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8gb7x" podUID="d6dea074-570b-440e-b555-46c1dde88efa" Nov 24 11:08:49 crc kubenswrapper[4752]: I1124 11:08:49.727179 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:49 crc kubenswrapper[4752]: I1124 11:08:49.727218 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:49 crc kubenswrapper[4752]: I1124 11:08:49.727227 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:49 crc kubenswrapper[4752]: E1124 11:08:49.727331 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 11:08:49 crc kubenswrapper[4752]: E1124 11:08:49.727448 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 11:08:49 crc kubenswrapper[4752]: E1124 11:08:49.727542 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 11:08:50 crc kubenswrapper[4752]: I1124 11:08:50.727623 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:08:50 crc kubenswrapper[4752]: I1124 11:08:50.732216 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 24 11:08:50 crc kubenswrapper[4752]: I1124 11:08:50.734006 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.727980 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.728001 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.728015 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.730431 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.730942 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.731047 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 24 11:08:51 crc kubenswrapper[4752]: I1124 11:08:51.731600 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.420775 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.460588 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.461078 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.461237 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.461459 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.461627 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-znzhg"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.461964 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.462592 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvmg4"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.463230 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.463620 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.464053 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.465618 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.465776 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.465965 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.466043 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.473628 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.477807 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.485217 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.486953 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487195 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487198 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487346 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487379 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487465 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487473 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.487627 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502068 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502300 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502554 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502590 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502711 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.502941 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504012 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504129 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504220 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504426 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504567 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504682 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504798 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504817 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.504935 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.505064 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.505204 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.505342 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.505473 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.505703 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506377 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506618 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506767 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506804 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506873 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506972 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507024 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507119 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507167 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507248 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507272 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507382 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507495 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.506375 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.507856 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508114 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508320 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508433 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508477 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508610 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.508352 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.509239 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.509728 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.509815 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.510722 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sght6"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.511026 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.512804 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.514788 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.516702 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.517832 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l5zrr"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.518306 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.518902 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.519453 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.519795 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.520131 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.522836 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.523366 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.524930 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.525265 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.529535 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jzbd4"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.530097 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c2q5l"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.530487 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.531013 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.531598 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.531831 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.533642 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.533763 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.533814 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.533884 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.534095 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.534361 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.534813 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535073 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535279 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535455 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535535 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535620 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535703 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.535813 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536311 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536375 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536639 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536677 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536795 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536910 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.536957 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537012 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537081 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537136 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537515 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537613 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.537701 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.540123 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.543272 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.544737 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.545047 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.545284 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.545682 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.546120 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.546544 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.551930 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.552411 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.552964 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.553002 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.553201 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.553478 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.554231 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.558503 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.559024 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.561570 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.562079 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.575911 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267a4077-a682-409c-855f-7de05580fc97-serving-cert\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.575945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64cpv\" (UniqueName: \"kubernetes.io/projected/267a4077-a682-409c-855f-7de05580fc97-kube-api-access-64cpv\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.575986 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/267a4077-a682-409c-855f-7de05580fc97-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.576190 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.578477 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.579054 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.579633 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.582922 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.583281 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.583445 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.583570 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.583640 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.584431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.585521 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.590125 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.601223 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.602941 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5l427"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.603441 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.603643 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.603837 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.604568 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.604568 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.604981 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.606067 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.609140 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.609773 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-znzhg"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.610126 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.611904 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.613527 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.614032 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.614295 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.614652 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.620705 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.621328 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.621789 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.622800 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.624197 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.630101 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.631576 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.632111 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.637466 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.637740 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.639041 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ljttt"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.640481 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.641278 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xdzjw"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.644216 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.646816 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.648972 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.651420 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w7dbc"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.651914 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.652003 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.652488 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.652710 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.653286 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.653967 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.656408 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jzbd4"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.657054 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cvplk"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.659458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5l427"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.659606 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.659986 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.661876 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.663296 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.664650 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c2q5l"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.666029 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.669770 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l5zrr"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.674167 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.675306 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676449 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-images\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676506 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676544 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-config\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267a4077-a682-409c-855f-7de05580fc97-serving-cert\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676607 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64cpv\" (UniqueName: \"kubernetes.io/projected/267a4077-a682-409c-855f-7de05580fc97-kube-api-access-64cpv\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hfpg\" (UniqueName: \"kubernetes.io/projected/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-kube-api-access-7hfpg\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.676660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/267a4077-a682-409c-855f-7de05580fc97-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.677018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/267a4077-a682-409c-855f-7de05580fc97-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.679184 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.679207 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sght6"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.681001 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.681736 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.682613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267a4077-a682-409c-855f-7de05580fc97-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.682853 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.684411 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.686347 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.687323 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xdzjw"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.688950 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.690307 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.692703 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.694479 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.696568 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvmg4"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.698626 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.699929 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.702939 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.704824 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.709819 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.710428 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.711695 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ljttt"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.712610 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.713611 4752 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cvplk"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.714655 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.715720 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.716813 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qvffh"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.717518 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.718082 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pchf4"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.718857 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.720192 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.721346 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvffh"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.722437 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sd7kr"] Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.722647 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.723530 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sd7kr"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.724175 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sd7kr"]
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.743590 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.763914 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.777091 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-images\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.777233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.777347 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-config\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.777548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hfpg\" (UniqueName: \"kubernetes.io/projected/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-kube-api-access-7hfpg\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.778182 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-config\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.778265 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-images\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.783280 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.789248 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"
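[note] The reconciler_common.go:245 ("VerifyControllerAttachedVolume started"), reconciler_common.go:218 ("MountVolume started"), and operation_generator.go:637 ("MountVolume.SetUp succeeded") records above trace one pass of the kubelet volume manager's reconciler: each volume in the desired state of world that is not yet in the actual state of world is verified attached, mounted, and then marked mounted. A rough sketch of that loop, assuming hypothetical types rather than the kubelet's real data structures:

    // sketch: kubelet volume-manager reconcile pass (simplified, hypothetical)
    package main

    import "fmt"

    type volume struct{ name, pod string }

    // reconcile walks the desired state of world and mounts anything the
    // actual state of world does not have yet, mimicking the
    // VerifyControllerAttachedVolume -> MountVolume -> SetUp sequence
    // in the records above.
    func reconcile(desired []volume, mounted map[string]bool) {
        for _, v := range desired {
            key := v.pod + "/" + v.name
            if mounted[key] {
                continue // already in the actual state of world
            }
            fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
            fmt.Printf("MountVolume started for volume %q\n", v.name)
            // ... the operation executor would run the plugin's SetUp here ...
            mounted[key] = true
            fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
        }
    }

    func main() {
        desired := []volume{
            {"config", "machine-api-operator-5694c8668f-7xbdq"},
            {"images", "machine-api-operator-5694c8668f-7xbdq"},
        }
        reconcile(desired, map[string]bool{})
    }

In the real kubelet the three messages can interleave across pods, as they do above, because operations for different volumes run concurrently.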
\"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.802994 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.823259 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.844942 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.863086 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.882911 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.903937 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.924146 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.943295 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.963849 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 24 11:08:54 crc kubenswrapper[4752]: I1124 11:08:54.984493 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.003045 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.024275 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.043492 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.084513 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.103958 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.123399 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.144667 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.163346 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.203598 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.224122 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.244132 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.266801 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.291492 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.304605 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.323466 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.343629 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.363772 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.384308 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.403843 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.424678 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.444726 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.464135 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.485137 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.503313 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.522936 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.544235 4752 
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.563906 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.582879 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.604894 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.622185 4752 request.go:700] Waited for 1.011440484s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-dockercfg-k9rxt&limit=500&resourceVersion=0
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.624113 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.644138 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.663963 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.683705 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.703578 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.723446 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.743521 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.763167 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.783838 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.804134 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.843896 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.844572 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.864091 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
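[note] The request.go:700 record above ("Waited for 1.011440484s due to client-side throttling, not priority and fairness") shows the client's own QPS limiter, not the API server, delaying a GET: the burst of per-object LISTs during this warm-up exceeds the client's local rate budget, so requests queue before they are sent. A toy token-bucket illustration in stdlib Go (the qps value is made up; the real client uses a configurable QPS/burst rate limiter):

    // sketch: client-side request throttling (simplified token bucket)
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const qps = 5 // hypothetical budget: 5 requests per second
        tokens := make(chan struct{}, 1)
        go func() {
            tick := time.NewTicker(time.Second / qps)
            defer tick.Stop()
            for range tick.C {
                select {
                case tokens <- struct{}{}:
                default: // bucket full, drop the token
                }
            }
        }()

        for i := 0; i < 3; i++ {
            start := time.Now()
            <-tokens // block until a token is available
            if waited := time.Since(start); waited > 50*time.Millisecond {
                // mirrors: "Waited for 1.011440484s due to client-side throttling"
                fmt.Printf("Waited for %v due to client-side throttling\n", waited)
            }
            fmt.Printf("GET request %d sent\n", i+1)
        }
    }

The one-second wait in the record is therefore harmless backpressure during startup, distinct from server-side priority and fairness.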
object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.883404 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.903710 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.923287 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.943418 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.963723 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 24 11:08:55 crc kubenswrapper[4752]: I1124 11:08:55.983995 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.003221 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.023872 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.044221 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.063730 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.083369 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.103371 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.123587 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.144443 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.163554 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.183831 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.203656 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.223968 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.244174 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.265055 4752 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.285204 4752 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.303741 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.344290 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.349521 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64cpv\" (UniqueName: \"kubernetes.io/projected/267a4077-a682-409c-855f-7de05580fc97-kube-api-access-64cpv\") pod \"openshift-config-operator-7777fb866f-d5cwn\" (UID: \"267a4077-a682-409c-855f-7de05580fc97\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.363942 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.383822 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.403917 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.423823 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.432084 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.443830 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.463775 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.483323 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.504714 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.524084 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.561985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hfpg\" (UniqueName: \"kubernetes.io/projected/6e9f72eb-52cd-47cc-b939-3301c0aa7f3c-kube-api-access-7hfpg\") pod \"machine-api-operator-5694c8668f-7xbdq\" (UID: \"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.594910 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.594983 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-client\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595054 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595078 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit-dir\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595360 
4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595430 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf8nr\" (UniqueName: \"kubernetes.io/projected/101dea51-f55e-4def-ae4f-e3d0ce84351f-kube-api-access-zf8nr\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595512 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39385e4-9c04-4aae-878c-34ab4d097664-serving-cert\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595549 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03c5740-8446-4caf-9072-274155052591-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595620 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnn4t\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-kube-api-access-cnn4t\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngx8g\" (UniqueName: \"kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " 
pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595762 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595807 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae98c3e-6921-4f96-beab-9d294469a8fe-metrics-tls\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595870 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595893 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-dir\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.595959 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596018 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-service-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596058 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc 
kubenswrapper[4752]: I1124 11:08:56.596094 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596120 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b03c5740-8446-4caf-9072-274155052591-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596143 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596168 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8d1db8d4-8511-4932-8478-61ddedab3680-machine-approver-tls\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596197 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvsgf\" (UniqueName: \"kubernetes.io/projected/ac73154f-7953-4cb0-b785-ebed9795fe9e-kube-api-access-kvsgf\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-encryption-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596341 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nmlc\" (UniqueName: \"kubernetes.io/projected/e7f6a4fd-346f-4828-b077-4d4888917d6a-kube-api-access-9nmlc\") pod \"downloads-7954f5f757-sght6\" (UID: \"e7f6a4fd-346f-4828-b077-4d4888917d6a\") " pod="openshift-console/downloads-7954f5f757-sght6" Nov 
24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596367 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-config\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596412 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596437 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101dea51-f55e-4def-ae4f-e3d0ce84351f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596464 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596589 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596688 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596816 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6lhf\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.596854 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-config\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc 
kubenswrapper[4752]: I1124 11:08:56.596939 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b04041a-699e-42d5-a26e-3af266dc33ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-auth-proxy-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597050 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jtr\" (UniqueName: \"kubernetes.io/projected/0bff57c9-60c1-47ad-8b27-040cb3453a55-kube-api-access-t6jtr\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597145 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597290 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597369 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pfbf\" (UniqueName: \"kubernetes.io/projected/7b04041a-699e-42d5-a26e-3af266dc33ee-kube-api-access-2pfbf\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-policies\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597506 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-image-import-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b04041a-699e-42d5-a26e-3af266dc33ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597642 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-config\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597777 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-serving-cert\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597925 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hsbd\" (UniqueName: \"kubernetes.io/projected/bae98c3e-6921-4f96-beab-9d294469a8fe-kube-api-access-6hsbd\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.597961 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9e501c-00d3-4607-8b6b-b01485ba4961-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc 
kubenswrapper[4752]: I1124 11:08:56.598030 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-node-pullsecrets\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598093 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598156 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598186 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-service-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598219 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598252 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598282 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9d2\" (UniqueName: \"kubernetes.io/projected/f39385e4-9c04-4aae-878c-34ab4d097664-kube-api-access-hg9d2\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598314 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598344 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598382 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-encryption-config\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.598405 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.098384891 +0000 UTC m=+143.083205200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598441 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cff4w\" (UniqueName: \"kubernetes.io/projected/6a04c082-6938-494c-833e-65f1f25817a9-kube-api-access-cff4w\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598493 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598537 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598575 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-serving-cert\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101dea51-f55e-4def-ae4f-e3d0ce84351f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-config\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-client\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598737 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598799 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598834 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598869 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ee461c-f13e-49f3-ba77-8063e92a9d01-serving-cert\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598902 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.598974 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599004 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599038 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-trusted-ca\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599100 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmj4\" (UniqueName: \"kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599135 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-client\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599235 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglc9\" (UniqueName: \"kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599280 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnz8\" (UniqueName: \"kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599360 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599391 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ndvd\" (UniqueName: \"kubernetes.io/projected/b8ee461c-f13e-49f3-ba77-8063e92a9d01-kube-api-access-4ndvd\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599423 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599453 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h74z\" (UniqueName: \"kubernetes.io/projected/8d1db8d4-8511-4932-8478-61ddedab3680-kube-api-access-6h74z\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599489 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b03c5740-8446-4caf-9072-274155052591-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599528 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599557 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-serving-cert\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/224c53cb-9374-44a5-8256-0c45d1fbab84-kube-api-access-bd7zx\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599640 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a04c082-6938-494c-833e-65f1f25817a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.599675 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9e501c-00d3-4607-8b6b-b01485ba4961-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.700515 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.700645 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.200619403 +0000 UTC m=+143.185439692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.700893 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891cfebf-6f4e-446d-85f4-4231622d7886-config-volume\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.700912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9503659c-61c6-42ce-988d-ee76cef6db29-serving-cert\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.700941 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-srv-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.700958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6gcq\" (UniqueName: \"kubernetes.io/projected/2521dbb1-8413-4180-bd70-2cf9019e583c-kube-api-access-l6gcq\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701020 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701106 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b5fea82-8918-4237-99a6-eae4894a0b5f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701288 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-registration-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701316 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9d2\" (UniqueName: \"kubernetes.io/projected/f39385e4-9c04-4aae-878c-34ab4d097664-kube-api-access-hg9d2\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701406 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701424 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8731be1c-9d01-46ea-9237-741ac9b9eb31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-config\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-client\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701499 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-serving-cert\") pod 
\"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701533 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5284b897-69bd-490f-a02d-2d8647c73029-signing-cabundle\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701552 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701567 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701600 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ee461c-f13e-49f3-ba77-8063e92a9d01-serving-cert\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701732 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ttbp\" (UniqueName: \"kubernetes.io/projected/08f929dc-5226-49d2-9048-9b4fb65eff57-kube-api-access-8ttbp\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701764 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-stats-auth\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 
11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3147bef-3d3e-4b85-be3f-bfd0887849a9-config\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701796 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5wg\" (UniqueName: \"kubernetes.io/projected/05ea66c9-da9e-44a9-b85b-11514abd77e1-kube-api-access-ts5wg\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701824 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701841 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-trusted-ca\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701869 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-client\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701886 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zglc9\" (UniqueName: \"kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701902 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfztv\" (UniqueName: \"kubernetes.io/projected/891cfebf-6f4e-446d-85f4-4231622d7886-kube-api-access-hfztv\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/14136410-157d-4287-96e1-6939c887bef3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14136410-157d-4287-96e1-6939c887bef3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701959 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ndvd\" (UniqueName: \"kubernetes.io/projected/b8ee461c-f13e-49f3-ba77-8063e92a9d01-kube-api-access-4ndvd\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701975 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.701991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-images\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702016 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93b229ce-883d-43b5-a41c-1012da66417b-proxy-tls\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702034 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702052 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a04c082-6938-494c-833e-65f1f25817a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-tmpfs\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702086 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702101 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit-dir\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702146 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf8nr\" (UniqueName: \"kubernetes.io/projected/101dea51-f55e-4def-ae4f-e3d0ce84351f-kube-api-access-zf8nr\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-webhook-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702178 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-default-certificate\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702192 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrc5k\" (UniqueName: \"kubernetes.io/projected/5284b897-69bd-490f-a02d-2d8647c73029-kube-api-access-zrc5k\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: 
\"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702209 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ptsq\" (UniqueName: \"kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702226 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39385e4-9c04-4aae-878c-34ab4d097664-serving-cert\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702241 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702256 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03c5740-8446-4caf-9072-274155052591-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702271 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq5s4\" (UniqueName: \"kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702286 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9503659c-61c6-42ce-988d-ee76cef6db29-config\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702302 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702338 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmnj7\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-kube-api-access-hmnj7\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702366 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702402 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-dir\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-csi-data-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702444 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-service-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702459 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b03c5740-8446-4caf-9072-274155052591-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702521 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-apiservice-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702536 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702551 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgtb5\" (UniqueName: \"kubernetes.io/projected/cae410f5-e7a9-42e2-9aaa-d6634d848e58-kube-api-access-pgtb5\") pod \"migrator-59844c95c7-ptrzc\" (UID: \"cae410f5-e7a9-42e2-9aaa-d6634d848e58\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702565 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-service-ca-bundle\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702582 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702599 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2521dbb1-8413-4180-bd70-2cf9019e583c-cert\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702632 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-encryption-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-node-bootstrap-token\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702701 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-certs\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702738 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6lhf\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b04041a-699e-42d5-a26e-3af266dc33ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jtr\" (UniqueName: \"kubernetes.io/projected/0bff57c9-60c1-47ad-8b27-040cb3453a55-kube-api-access-t6jtr\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68js\" (UniqueName: \"kubernetes.io/projected/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-kube-api-access-b68js\") pod 
\"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702849 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702864 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-policies\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702882 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b04041a-699e-42d5-a26e-3af266dc33ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702897 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702912 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-config\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702927 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v9sd\" (UniqueName: \"kubernetes.io/projected/cffc3e00-cf34-448e-83c9-96c95a232caa-kube-api-access-5v9sd\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702942 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7k2d\" (UniqueName: \"kubernetes.io/projected/93b229ce-883d-43b5-a41c-1012da66417b-kube-api-access-r7k2d\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702956 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5284b897-69bd-490f-a02d-2d8647c73029-signing-key\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702969 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-plugins-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.702998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703014 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-node-pullsecrets\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703028 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703042 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qhw\" (UniqueName: \"kubernetes.io/projected/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-kube-api-access-p2qhw\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703057 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8wjf\" (UniqueName: \"kubernetes.io/projected/9b5fea82-8918-4237-99a6-eae4894a0b5f-kube-api-access-t8wjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-service-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3147bef-3d3e-4b85-be3f-bfd0887849a9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 
11:08:56.703120 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93b229ce-883d-43b5-a41c-1012da66417b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-encryption-config\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703185 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cff4w\" (UniqueName: \"kubernetes.io/projected/6a04c082-6938-494c-833e-65f1f25817a9-kube-api-access-cff4w\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703200 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvzb\" (UniqueName: \"kubernetes.io/projected/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-kube-api-access-nvvzb\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxxg\" (UniqueName: \"kubernetes.io/projected/14136410-157d-4287-96e1-6939c887bef3-kube-api-access-tvxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703232 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-socket-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703249 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101dea51-f55e-4def-ae4f-e3d0ce84351f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703263 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703277 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bd8\" (UniqueName: \"kubernetes.io/projected/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-kube-api-access-h7bd8\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703310 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703325 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3258c094-1ed5-4938-8b6c-284465245440-metrics-tls\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703341 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmj4\" (UniqueName: \"kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703372 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnz8\" (UniqueName: \"kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703405 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703420 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h74z\" (UniqueName: \"kubernetes.io/projected/8d1db8d4-8511-4932-8478-61ddedab3680-kube-api-access-6h74z\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703435 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b03c5740-8446-4caf-9072-274155052591-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3147bef-3d3e-4b85-be3f-bfd0887849a9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703469 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-serving-cert\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/224c53cb-9374-44a5-8256-0c45d1fbab84-kube-api-access-bd7zx\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703500 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3258c094-1ed5-4938-8b6c-284465245440-trusted-ca\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: 
\"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9e501c-00d3-4607-8b6b-b01485ba4961-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703533 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-srv-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703549 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-client\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703563 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnn4t\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-kube-api-access-cnn4t\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703608 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngx8g\" (UniqueName: \"kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae98c3e-6921-4f96-beab-9d294469a8fe-metrics-tls\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703655 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703671 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703686 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703701 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffc3e00-cf34-448e-83c9-96c95a232caa-proxy-tls\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703716 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-mountpoint-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8d1db8d4-8511-4932-8478-61ddedab3680-machine-approver-tls\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703773 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-profile-collector-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703806 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703832 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvsgf\" (UniqueName: 
\"kubernetes.io/projected/ac73154f-7953-4cb0-b785-ebed9795fe9e-kube-api-access-kvsgf\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nmlc\" (UniqueName: \"kubernetes.io/projected/e7f6a4fd-346f-4828-b077-4d4888917d6a-kube-api-access-9nmlc\") pod \"downloads-7954f5f757-sght6\" (UID: \"e7f6a4fd-346f-4828-b077-4d4888917d6a\") " pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703877 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-config\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703919 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101dea51-f55e-4def-ae4f-e3d0ce84351f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-metrics-certs\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703964 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.703986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-config\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-auth-proxy-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704024 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/891cfebf-6f4e-446d-85f4-4231622d7886-metrics-tls\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704041 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704056 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pfbf\" (UniqueName: \"kubernetes.io/projected/7b04041a-699e-42d5-a26e-3af266dc33ee-kube-api-access-2pfbf\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704088 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wnkz\" (UniqueName: \"kubernetes.io/projected/8731be1c-9d01-46ea-9237-741ac9b9eb31-kube-api-access-7wnkz\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704107 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflcp\" (UniqueName: \"kubernetes.io/projected/fb1fcea4-1195-476d-b01c-875ab966a466-kube-api-access-bflcp\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704124 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-image-import-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704139 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 
11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704155 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1fcea4-1195-476d-b01c-875ab966a466-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-serving-cert\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704222 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffk2\" (UniqueName: \"kubernetes.io/projected/9503659c-61c6-42ce-988d-ee76cef6db29-kube-api-access-fffk2\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704302 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hsbd\" (UniqueName: \"kubernetes.io/projected/bae98c3e-6921-4f96-beab-9d294469a8fe-kube-api-access-6hsbd\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.704707 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9e501c-00d3-4607-8b6b-b01485ba4961-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.720020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-node-pullsecrets\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.720091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.720141 4752 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-dir\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.722102 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.222091233 +0000 UTC m=+143.206911522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.726569 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit-dir\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.749032 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.749322 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-config\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.749578 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b04041a-699e-42d5-a26e-3af266dc33ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.750763 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.752784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.751628 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-serving-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.751852 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-audit\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.752652 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101dea51-f55e-4def-ae4f-e3d0ce84351f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.751492 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.753309 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.753796 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8d1db8d4-8511-4932-8478-61ddedab3680-auth-proxy-config\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.753883 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.754619 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-encryption-config\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.754613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc 
kubenswrapper[4752]: I1124 11:08:56.754623 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a9e501c-00d3-4607-8b6b-b01485ba4961-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.754894 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.755677 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0bff57c9-60c1-47ad-8b27-040cb3453a55-image-import-ca\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.756098 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03c5740-8446-4caf-9072-274155052591-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.756432 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.756701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.757425 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-config\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.757727 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.758129 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config\") pod 
\"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.758393 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.758404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.758399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.759018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.759181 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.759934 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.759948 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac73154f-7953-4cb0-b785-ebed9795fe9e-audit-policies\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.760219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-config\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.760552 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8ee461c-f13e-49f3-ba77-8063e92a9d01-service-ca-bundle\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.760658 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.761157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.761216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.761619 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-encryption-config\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763320 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763465 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101dea51-f55e-4def-ae4f-e3d0ce84351f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763609 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " 
pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763710 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b04041a-699e-42d5-a26e-3af266dc33ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763930 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-config\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.763961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.764076 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-service-ca\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.764133 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f39385e4-9c04-4aae-878c-34ab4d097664-trusted-ca\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.764634 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b03c5740-8446-4caf-9072-274155052591-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.765031 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-serving-cert\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.765312 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae98c3e-6921-4f96-beab-9d294469a8fe-metrics-tls\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.765828 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8d1db8d4-8511-4932-8478-61ddedab3680-machine-approver-tls\") pod \"machine-approver-56656f9798-jntpt\" (UID: 
\"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.766202 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-etcd-client\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.766526 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a04c082-6938-494c-833e-65f1f25817a9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.766913 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9e501c-00d3-4607-8b6b-b01485ba4961-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.767016 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.767161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.767238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.767221 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.767538 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc 
kubenswrapper[4752]: I1124 11:08:56.767418 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.768209 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.769650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ee461c-f13e-49f3-ba77-8063e92a9d01-serving-cert\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.770036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.770296 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39385e4-9c04-4aae-878c-34ab4d097664-serving-cert\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.771425 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.771525 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.771592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.771824 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-serving-cert\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.772085 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac73154f-7953-4cb0-b785-ebed9795fe9e-serving-cert\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.772177 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.772698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/224c53cb-9374-44a5-8256-0c45d1fbab84-etcd-client\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.773249 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.774173 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.774198 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9d2\" (UniqueName: \"kubernetes.io/projected/f39385e4-9c04-4aae-878c-34ab4d097664-kube-api-access-hg9d2\") pod \"console-operator-58897d9998-l5zrr\" (UID: \"f39385e4-9c04-4aae-878c-34ab4d097664\") " pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.775511 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0bff57c9-60c1-47ad-8b27-040cb3453a55-etcd-client\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.786831 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn"] Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.797082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6lhf\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf\") pod 
\"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.806259 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807074 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-apiservice-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807211 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807258 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgtb5\" (UniqueName: \"kubernetes.io/projected/cae410f5-e7a9-42e2-9aaa-d6634d848e58-kube-api-access-pgtb5\") pod \"migrator-59844c95c7-ptrzc\" (UID: \"cae410f5-e7a9-42e2-9aaa-d6634d848e58\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-service-ca-bundle\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807320 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2521dbb1-8413-4180-bd70-2cf9019e583c-cert\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807355 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-node-bootstrap-token\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-certs\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807419 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807484 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68js\" (UniqueName: \"kubernetes.io/projected/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-kube-api-access-b68js\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807557 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807585 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7k2d\" (UniqueName: \"kubernetes.io/projected/93b229ce-883d-43b5-a41c-1012da66417b-kube-api-access-r7k2d\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807618 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5284b897-69bd-490f-a02d-2d8647c73029-signing-key\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807647 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-plugins-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807682 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v9sd\" (UniqueName: \"kubernetes.io/projected/cffc3e00-cf34-448e-83c9-96c95a232caa-kube-api-access-5v9sd\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qhw\" (UniqueName: \"kubernetes.io/projected/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-kube-api-access-p2qhw\") pod \"olm-operator-6b444d44fb-wbs52\" 
(UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8wjf\" (UniqueName: \"kubernetes.io/projected/9b5fea82-8918-4237-99a6-eae4894a0b5f-kube-api-access-t8wjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807804 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3147bef-3d3e-4b85-be3f-bfd0887849a9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807852 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93b229ce-883d-43b5-a41c-1012da66417b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807934 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvzb\" (UniqueName: \"kubernetes.io/projected/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-kube-api-access-nvvzb\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxxg\" (UniqueName: \"kubernetes.io/projected/14136410-157d-4287-96e1-6939c887bef3-kube-api-access-tvxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.807993 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-socket-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808027 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808110 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bd8\" (UniqueName: \"kubernetes.io/projected/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-kube-api-access-h7bd8\") pod \"router-default-5444994796-w7dbc\" (UID: 
\"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808138 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3258c094-1ed5-4938-8b6c-284465245440-metrics-tls\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3147bef-3d3e-4b85-be3f-bfd0887849a9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3258c094-1ed5-4938-8b6c-284465245440-trusted-ca\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808305 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-srv-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffc3e00-cf34-448e-83c9-96c95a232caa-proxy-tls\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808412 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-mountpoint-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808442 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-profile-collector-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-metrics-certs\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808534 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/891cfebf-6f4e-446d-85f4-4231622d7886-metrics-tls\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.808567 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wnkz\" (UniqueName: \"kubernetes.io/projected/8731be1c-9d01-46ea-9237-741ac9b9eb31-kube-api-access-7wnkz\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.809107 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.309077194 +0000 UTC m=+143.293897483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.809485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflcp\" (UniqueName: \"kubernetes.io/projected/fb1fcea4-1195-476d-b01c-875ab966a466-kube-api-access-bflcp\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.809561 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1fcea4-1195-476d-b01c-875ab966a466-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.809589 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.809622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffk2\" (UniqueName: \"kubernetes.io/projected/9503659c-61c6-42ce-988d-ee76cef6db29-kube-api-access-fffk2\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810001 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810174 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9503659c-61c6-42ce-988d-ee76cef6db29-serving-cert\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891cfebf-6f4e-446d-85f4-4231622d7886-config-volume\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810530 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-srv-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810574 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6gcq\" (UniqueName: \"kubernetes.io/projected/2521dbb1-8413-4180-bd70-2cf9019e583c-kube-api-access-l6gcq\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810608 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-registration-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b5fea82-8918-4237-99a6-eae4894a0b5f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810681 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8731be1c-9d01-46ea-9237-741ac9b9eb31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810715 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5284b897-69bd-490f-a02d-2d8647c73029-signing-cabundle\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ttbp\" (UniqueName: \"kubernetes.io/projected/08f929dc-5226-49d2-9048-9b4fb65eff57-kube-api-access-8ttbp\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810825 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-stats-auth\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810858 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3147bef-3d3e-4b85-be3f-bfd0887849a9-config\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810893 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5wg\" (UniqueName: \"kubernetes.io/projected/05ea66c9-da9e-44a9-b85b-11514abd77e1-kube-api-access-ts5wg\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810937 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfztv\" (UniqueName: \"kubernetes.io/projected/891cfebf-6f4e-446d-85f4-4231622d7886-kube-api-access-hfztv\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.810987 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14136410-157d-4287-96e1-6939c887bef3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811022 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14136410-157d-4287-96e1-6939c887bef3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93b229ce-883d-43b5-a41c-1012da66417b-proxy-tls\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811121 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-images\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811155 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-tmpfs\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811174 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811199 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-webhook-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811231 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-default-certificate\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811269 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrc5k\" (UniqueName: \"kubernetes.io/projected/5284b897-69bd-490f-a02d-2d8647c73029-kube-api-access-zrc5k\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811296 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptsq\" (UniqueName: \"kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811326 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq5s4\" (UniqueName: \"kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9503659c-61c6-42ce-988d-ee76cef6db29-config\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811389 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmnj7\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-kube-api-access-hmnj7\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811488 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-csi-data-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-csi-data-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.811854 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-service-ca-bundle\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.812051 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.812655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-apiservice-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.812878 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.813111 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: 
\"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.814036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-plugins-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.814115 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-mountpoint-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.814339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3258c094-1ed5-4938-8b6c-284465245440-trusted-ca\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.816286 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-socket-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.816947 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.316929191 +0000 UTC m=+143.301749480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.817205 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-metrics-certs\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.817574 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891cfebf-6f4e-446d-85f4-4231622d7886-config-volume\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.817737 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14136410-157d-4287-96e1-6939c887bef3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.817793 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93b229ce-883d-43b5-a41c-1012da66417b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818237 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/891cfebf-6f4e-446d-85f4-4231622d7886-metrics-tls\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818289 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-tmpfs\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818611 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cffc3e00-cf34-448e-83c9-96c95a232caa-images\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818776 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3147bef-3d3e-4b85-be3f-bfd0887849a9-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818847 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-srv-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.818999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3258c094-1ed5-4938-8b6c-284465245440-metrics-tls\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.819041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2521dbb1-8413-4180-bd70-2cf9019e583c-cert\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.819336 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/08f929dc-5226-49d2-9048-9b4fb65eff57-profile-collector-cert\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.819376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffc3e00-cf34-448e-83c9-96c95a232caa-proxy-tls\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.819971 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5284b897-69bd-490f-a02d-2d8647c73029-signing-key\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.820217 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-webhook-cert\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.820349 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-certs\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.823019 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.823199 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9503659c-61c6-42ce-988d-ee76cef6db29-config\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.823774 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14136410-157d-4287-96e1-6939c887bef3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.823806 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9503659c-61c6-42ce-988d-ee76cef6db29-serving-cert\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824015 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmj4\" (UniqueName: \"kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4\") pod \"controller-manager-879f6c89f-tn49v\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824150 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1fcea4-1195-476d-b01c-875ab966a466-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824418 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824461 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-default-certificate\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824460 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5284b897-69bd-490f-a02d-2d8647c73029-signing-cabundle\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824621 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824725 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93b229ce-883d-43b5-a41c-1012da66417b-proxy-tls\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824837 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3147bef-3d3e-4b85-be3f-bfd0887849a9-config\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824878 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.824930 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/05ea66c9-da9e-44a9-b85b-11514abd77e1-registration-dir\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.825530 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-node-bootstrap-token\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.826138 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-stats-auth\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.830157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b5fea82-8918-4237-99a6-eae4894a0b5f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.830781 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8731be1c-9d01-46ea-9237-741ac9b9eb31-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.831763 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-srv-cert\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.836976 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jtr\" (UniqueName: \"kubernetes.io/projected/0bff57c9-60c1-47ad-8b27-040cb3453a55-kube-api-access-t6jtr\") pod \"apiserver-76f77b778f-xvmg4\" (UID: \"0bff57c9-60c1-47ad-8b27-040cb3453a55\") " pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.859917 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnz8\" (UniqueName: \"kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8\") pod \"console-f9d7485db-x7lpn\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.878219 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.882097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hsbd\" (UniqueName: \"kubernetes.io/projected/bae98c3e-6921-4f96-beab-9d294469a8fe-kube-api-access-6hsbd\") pod \"dns-operator-744455d44c-jzbd4\" (UID: \"bae98c3e-6921-4f96-beab-9d294469a8fe\") " pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.899245 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.912190 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:56 crc kubenswrapper[4752]: E1124 11:08:56.912864 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.412845653 +0000 UTC m=+143.397665952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.964938 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c0f8aa1-dec1-46a3-ad52-b1e3275504b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fvb85\" (UID: \"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.969400 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvsgf\" (UniqueName: \"kubernetes.io/projected/ac73154f-7953-4cb0-b785-ebed9795fe9e-kube-api-access-kvsgf\") pod \"apiserver-7bbb656c7d-f8zgb\" (UID: \"ac73154f-7953-4cb0-b785-ebed9795fe9e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.969941 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pfbf\" (UniqueName: \"kubernetes.io/projected/7b04041a-699e-42d5-a26e-3af266dc33ee-kube-api-access-2pfbf\") pod \"openshift-apiserver-operator-796bbdcf4f-r4qhq\" (UID: \"7b04041a-699e-42d5-a26e-3af266dc33ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.981620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nmlc\" (UniqueName: \"kubernetes.io/projected/e7f6a4fd-346f-4828-b077-4d4888917d6a-kube-api-access-9nmlc\") pod \"downloads-7954f5f757-sght6\" (UID: \"e7f6a4fd-346f-4828-b077-4d4888917d6a\") " pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.990113 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:08:56 crc kubenswrapper[4752]: I1124 11:08:56.999213 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7zx\" (UniqueName: \"kubernetes.io/projected/224c53cb-9374-44a5-8256-0c45d1fbab84-kube-api-access-bd7zx\") pod \"etcd-operator-b45778765-c2q5l\" (UID: \"224c53cb-9374-44a5-8256-0c45d1fbab84\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.013607 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.014119 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.514106756 +0000 UTC m=+143.498927035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.019056 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cff4w\" (UniqueName: \"kubernetes.io/projected/6a04c082-6938-494c-833e-65f1f25817a9-kube-api-access-cff4w\") pod \"cluster-samples-operator-665b6dd947-gxnqd\" (UID: \"6a04c082-6938-494c-833e-65f1f25817a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.025193 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.025654 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.040723 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b03c5740-8446-4caf-9072-274155052591-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-txmns\" (UID: \"b03c5740-8446-4caf-9072-274155052591\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.054949 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.058850 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h74z\" (UniqueName: \"kubernetes.io/projected/8d1db8d4-8511-4932-8478-61ddedab3680-kube-api-access-6h74z\") pod \"machine-approver-56656f9798-jntpt\" (UID: \"8d1db8d4-8511-4932-8478-61ddedab3680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.062988 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.079981 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ndvd\" (UniqueName: \"kubernetes.io/projected/b8ee461c-f13e-49f3-ba77-8063e92a9d01-kube-api-access-4ndvd\") pod \"authentication-operator-69f744f599-znzhg\" (UID: \"b8ee461c-f13e-49f3-ba77-8063e92a9d01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.100166 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.102241 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnn4t\" (UniqueName: \"kubernetes.io/projected/3a9e501c-00d3-4607-8b6b-b01485ba4961-kube-api-access-cnn4t\") pod \"cluster-image-registry-operator-dc59b4c8b-rmq2r\" (UID: \"3a9e501c-00d3-4607-8b6b-b01485ba4961\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.116095 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.116612 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.616593376 +0000 UTC m=+143.601413675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.121900 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglc9\" (UniqueName: \"kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9\") pod \"route-controller-manager-6576b87f9c-cprsp\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.134549 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.134853 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.134876 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l5zrr"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.142980 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.143126 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7xbdq"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.144088 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngx8g\" (UniqueName: \"kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g\") pod \"oauth-openshift-558db77b4-965hb\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.157082 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.168544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf8nr\" (UniqueName: \"kubernetes.io/projected/101dea51-f55e-4def-ae4f-e3d0ce84351f-kube-api-access-zf8nr\") pod \"openshift-controller-manager-operator-756b6f6bc6-bkhh5\" (UID: \"101dea51-f55e-4def-ae4f-e3d0ce84351f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.203885 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qhw\" (UniqueName: \"kubernetes.io/projected/a5b18d85-5b4b-4e2d-a357-39462b1bbcd6-kube-api-access-p2qhw\") pod \"olm-operator-6b444d44fb-wbs52\" (UID: \"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.206713 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.211648 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.217629 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.218209 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.718194539 +0000 UTC m=+143.703014828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.224864 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7k2d\" (UniqueName: \"kubernetes.io/projected/93b229ce-883d-43b5-a41c-1012da66417b-kube-api-access-r7k2d\") pod \"machine-config-controller-84d6567774-4tkbk\" (UID: \"93b229ce-883d-43b5-a41c-1012da66417b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.240413 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xvmg4"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.249116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8wjf\" (UniqueName: \"kubernetes.io/projected/9b5fea82-8918-4237-99a6-eae4894a0b5f-kube-api-access-t8wjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-chccp\" (UID: \"9b5fea82-8918-4237-99a6-eae4894a0b5f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.252807 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.255143 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.266864 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.269083 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.270882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3147bef-3d3e-4b85-be3f-bfd0887849a9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-857vf\" (UID: \"f3147bef-3d3e-4b85-be3f-bfd0887849a9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.285869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wnkz\" (UniqueName: \"kubernetes.io/projected/8731be1c-9d01-46ea-9237-741ac9b9eb31-kube-api-access-7wnkz\") pod \"multus-admission-controller-857f4d67dd-ljttt\" (UID: \"8731be1c-9d01-46ea-9237-741ac9b9eb31\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.299587 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.300120 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgtb5\" (UniqueName: \"kubernetes.io/projected/cae410f5-e7a9-42e2-9aaa-d6634d848e58-kube-api-access-pgtb5\") pod \"migrator-59844c95c7-ptrzc\" (UID: \"cae410f5-e7a9-42e2-9aaa-d6634d848e58\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.318637 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.319286 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb"] Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.319345 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.819316168 +0000 UTC m=+143.804136467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.319666 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.320021 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.820012929 +0000 UTC m=+143.804833208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.327269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68js\" (UniqueName: \"kubernetes.io/projected/eac73a5d-96c7-4aad-8882-edbb7eb2b55d-kube-api-access-b68js\") pod \"packageserver-d55dfcdfc-txwcw\" (UID: \"eac73a5d-96c7-4aad-8882-edbb7eb2b55d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.337193 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.364231 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflcp\" (UniqueName: \"kubernetes.io/projected/fb1fcea4-1195-476d-b01c-875ab966a466-kube-api-access-bflcp\") pod \"package-server-manager-789f6589d5-htkkq\" (UID: \"fb1fcea4-1195-476d-b01c-875ab966a466\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.370851 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.377476 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v9sd\" (UniqueName: \"kubernetes.io/projected/cffc3e00-cf34-448e-83c9-96c95a232caa-kube-api-access-5v9sd\") pod \"machine-config-operator-74547568cd-5l427\" (UID: \"cffc3e00-cf34-448e-83c9-96c95a232caa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.386409 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" Nov 24 11:08:57 crc kubenswrapper[4752]: W1124 11:08:57.388968 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1db8d4_8511_4932_8478_61ddedab3680.slice/crio-0fbcf8a858db1e75ed897e6f2167e2ee9e7d1c5ef210e60007cae233b955fa40 WatchSource:0}: Error finding container 0fbcf8a858db1e75ed897e6f2167e2ee9e7d1c5ef210e60007cae233b955fa40: Status 404 returned error can't find the container with id 0fbcf8a858db1e75ed897e6f2167e2ee9e7d1c5ef210e60007cae233b955fa40 Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.415653 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.420550 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.420788 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:57.920771677 +0000 UTC m=+143.905591966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.421868 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5wg\" (UniqueName: \"kubernetes.io/projected/05ea66c9-da9e-44a9-b85b-11514abd77e1-kube-api-access-ts5wg\") pod \"csi-hostpathplugin-cvplk\" (UID: \"05ea66c9-da9e-44a9-b85b-11514abd77e1\") " pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.422578 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffk2\" (UniqueName: \"kubernetes.io/projected/9503659c-61c6-42ce-988d-ee76cef6db29-kube-api-access-fffk2\") pod \"service-ca-operator-777779d784-v9rmm\" (UID: \"9503659c-61c6-42ce-988d-ee76cef6db29\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.438506 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sght6"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.444710 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvzb\" (UniqueName: \"kubernetes.io/projected/57e2fd42-88ec-49e8-b089-ff3d0a9405cf-kube-api-access-nvvzb\") pod \"machine-config-server-pchf4\" (UID: \"57e2fd42-88ec-49e8-b089-ff3d0a9405cf\") " pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.460810 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.468026 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.469026 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxxg\" (UniqueName: \"kubernetes.io/projected/14136410-157d-4287-96e1-6939c887bef3-kube-api-access-tvxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhc89\" (UID: \"14136410-157d-4287-96e1-6939c887bef3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.481718 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfztv\" (UniqueName: \"kubernetes.io/projected/891cfebf-6f4e-446d-85f4-4231622d7886-kube-api-access-hfztv\") pod \"dns-default-sd7kr\" (UID: \"891cfebf-6f4e-446d-85f4-4231622d7886\") " pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.495731 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.505788 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.515273 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bd8\" (UniqueName: \"kubernetes.io/projected/c0998cf9-2c91-46b3-a9b8-a070e8b855b2-kube-api-access-h7bd8\") pod \"router-default-5444994796-w7dbc\" (UID: \"c0998cf9-2c91-46b3-a9b8-a070e8b855b2\") " pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.516696 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" event={"ID":"d4bbc9d0-4420-4581-a3be-919da131bf9a","Type":"ContainerStarted","Data":"8ff4ce0ecfa6820d3b74338d33dc1911d9c778721e3f0e9de7e97deb5c84ea6e"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.516919 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.523297 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.523691 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.02368043 +0000 UTC m=+144.008500719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.528152 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.530164 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrc5k\" (UniqueName: \"kubernetes.io/projected/5284b897-69bd-490f-a02d-2d8647c73029-kube-api-access-zrc5k\") pod \"service-ca-9c57cc56f-xdzjw\" (UID: \"5284b897-69bd-490f-a02d-2d8647c73029\") " pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.531954 4752 generic.go:334] "Generic (PLEG): container finished" podID="267a4077-a682-409c-855f-7de05580fc97" containerID="81f6739d7f4fcef9bbea40136c303ef095ea8c05b0cfab9ae0119896b8e03d57" exitCode=0 Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.532537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" event={"ID":"267a4077-a682-409c-855f-7de05580fc97","Type":"ContainerDied","Data":"81f6739d7f4fcef9bbea40136c303ef095ea8c05b0cfab9ae0119896b8e03d57"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.532572 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" event={"ID":"267a4077-a682-409c-855f-7de05580fc97","Type":"ContainerStarted","Data":"8a9d0f3820b5711de096dad5240c6710f6189f0d3efb0a627040c07ee5f8fca7"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.537794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sght6" event={"ID":"e7f6a4fd-346f-4828-b077-4d4888917d6a","Type":"ContainerStarted","Data":"56cbe7f0d0c27a531dd56f3e8bfbc6c3834e7f8302756dcf80ede0a5d9aa5e5a"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.541403 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" event={"ID":"8d1db8d4-8511-4932-8478-61ddedab3680","Type":"ContainerStarted","Data":"0fbcf8a858db1e75ed897e6f2167e2ee9e7d1c5ef210e60007cae233b955fa40"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.543235 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq5s4\" (UniqueName: \"kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4\") pod \"collect-profiles-29399700-qknbd\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.544727 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.548247 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.553286 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" event={"ID":"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c","Type":"ContainerStarted","Data":"5b223b02071b6ed6f651270cbad7df074cd0b48abe76b5bc1e8c2417b93671bc"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.556192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" event={"ID":"0bff57c9-60c1-47ad-8b27-040cb3453a55","Type":"ContainerStarted","Data":"a8c94a8718415996206718f1dcb58beb5c91cdea9f1872deb4365c08334c67ee"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.560183 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.561431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.565211 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" event={"ID":"7b04041a-699e-42d5-a26e-3af266dc33ee","Type":"ContainerStarted","Data":"d0486f35d3d0551cf94bb7e27a8b91cd4fc52fe191f54dcc7ed13f416565df41"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.569243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" event={"ID":"f39385e4-9c04-4aae-878c-34ab4d097664","Type":"ContainerStarted","Data":"1c24eaddf090185ac8301a8dd6b4dfec98757f8a2fde488a8625f87c5a1ae2d9"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.569279 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ptsq\" (UniqueName: \"kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq\") pod \"marketplace-operator-79b997595-bps76\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.576967 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.579046 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.582574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" event={"ID":"ac73154f-7953-4cb0-b785-ebed9795fe9e","Type":"ContainerStarted","Data":"488da87bc9ec6835eef065fa8e670344dfac58af5428ff3191f2ab1910746cca"} Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.589837 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmnj7\" (UniqueName: \"kubernetes.io/projected/3258c094-1ed5-4938-8b6c-284465245440-kube-api-access-hmnj7\") pod \"ingress-operator-5b745b69d9-fnfwt\" (UID: \"3258c094-1ed5-4938-8b6c-284465245440\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.596180 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.603144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ttbp\" (UniqueName: \"kubernetes.io/projected/08f929dc-5226-49d2-9048-9b4fb65eff57-kube-api-access-8ttbp\") pod \"catalog-operator-68c6474976-jjd72\" (UID: \"08f929dc-5226-49d2-9048-9b4fb65eff57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.620261 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.625570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6gcq\" (UniqueName: \"kubernetes.io/projected/2521dbb1-8413-4180-bd70-2cf9019e583c-kube-api-access-l6gcq\") pod \"ingress-canary-qvffh\" (UID: \"2521dbb1-8413-4180-bd70-2cf9019e583c\") " pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.626851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.633620 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.133572674 +0000 UTC m=+144.118392963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.633888 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pchf4" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.636109 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sd7kr" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.738125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.738497 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.238437465 +0000 UTC m=+144.223257754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.755691 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c2q5l"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.765105 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.776650 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.791984 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jzbd4"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.799969 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.833028 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.839566 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.840013 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.339993087 +0000 UTC m=+144.324813376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.858365 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:08:57 crc kubenswrapper[4752]: W1124 11:08:57.869525 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde8828e_798f_4da5_9f44_0b8a2726dcb1.slice/crio-18924eeddbfa085b3d991b5633ee912fcff9103dc42341b48e0aaeed1ed3dffc WatchSource:0}: Error finding container 18924eeddbfa085b3d991b5633ee912fcff9103dc42341b48e0aaeed1ed3dffc: Status 404 returned error can't find the container with id 18924eeddbfa085b3d991b5633ee912fcff9103dc42341b48e0aaeed1ed3dffc Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.887230 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.919958 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qvffh" Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.940676 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:57 crc kubenswrapper[4752]: E1124 11:08:57.941013 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.441001842 +0000 UTC m=+144.425822131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.947331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.965521 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-znzhg"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.967053 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.967737 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk"] Nov 24 11:08:57 crc kubenswrapper[4752]: I1124 11:08:57.995664 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.053964 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.054378 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.554362141 +0000 UTC m=+144.539182430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.080455 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.104188 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.114255 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.134801 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.157236 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.157773 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.657760149 +0000 UTC m=+144.642580438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.166718 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5l427"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.170303 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.227273 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.234948 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.249233 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.266474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.266921 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.766873989 +0000 UTC m=+144.751694278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.267312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.267727 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.767716555 +0000 UTC m=+144.752536854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.371488 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.371881 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.871867685 +0000 UTC m=+144.856687974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: W1124 11:08:58.421801 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae410f5_e7a9_42e2_9aaa_d6634d848e58.slice/crio-79ddf498b9f4c5466582cc2465780e47b7ea4df2bf2ae735527a5bc9d023d36d WatchSource:0}: Error finding container 79ddf498b9f4c5466582cc2465780e47b7ea4df2bf2ae735527a5bc9d023d36d: Status 404 returned error can't find the container with id 79ddf498b9f4c5466582cc2465780e47b7ea4df2bf2ae735527a5bc9d023d36d Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.473205 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.473528 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:58.97351629 +0000 UTC m=+144.958336569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.553622 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.574229 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.574665 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.074649489 +0000 UTC m=+145.059469778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.591234 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" event={"ID":"fde8828e-798f-4da5-9f44-0b8a2726dcb1","Type":"ContainerStarted","Data":"18924eeddbfa085b3d991b5633ee912fcff9103dc42341b48e0aaeed1ed3dffc"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.594988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" event={"ID":"267a4077-a682-409c-855f-7de05580fc97","Type":"ContainerStarted","Data":"1b356818c09d976136971f1321fe18f81a87e5cf451a35c7c1fb25c0b2bdcb2a"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.595102 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.596502 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" event={"ID":"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6","Type":"ContainerStarted","Data":"d330bcc0450314f7e7bb0bef8346eb7e0e13adf36ea3ef667970d77433a7882e"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.597645 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x7lpn" event={"ID":"df4cedec-414c-4253-8a55-79ed8c8734c1","Type":"ContainerStarted","Data":"f6b9149c1dc35ac45b08f19616de1d146ce2d1605c090c8982cf399d39d2794d"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.599402 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" event={"ID":"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c","Type":"ContainerStarted","Data":"e1fc3f4825d92af06dde88972160d3515936b98df1068c6d7f144723877a08cd"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.602140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" event={"ID":"8d1db8d4-8511-4932-8478-61ddedab3680","Type":"ContainerStarted","Data":"0817f903011e32cacc8640a0c2f264aa43da7b4fcd1136d64a47f435e5054985"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.604345 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" event={"ID":"9b5fea82-8918-4237-99a6-eae4894a0b5f","Type":"ContainerStarted","Data":"5b34920e7b46a15f215227b88ed33b93ae6fa4cfb4b9222c8e72b18489a0fc17"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.619481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" event={"ID":"14136410-157d-4287-96e1-6939c887bef3","Type":"ContainerStarted","Data":"7c6eb4eab3affa09854e5fb046b2c7d7b57d4060368958e81d989bdfafb3b5f4"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.633172 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" event={"ID":"d4bbc9d0-4420-4581-a3be-919da131bf9a","Type":"ContainerStarted","Data":"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.633621 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.634854 4752 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tn49v container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.634945 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.636735 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" event={"ID":"93b229ce-883d-43b5-a41c-1012da66417b","Type":"ContainerStarted","Data":"7d1b7144d7841c3e17bb97c33c33060b6e0932ec9f160f3441a3545e92a94e96"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.639016 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" event={"ID":"b8ee461c-f13e-49f3-ba77-8063e92a9d01","Type":"ContainerStarted","Data":"21eb119ac7322b5e547e6ea1a1dcdd04e68ff95a9324af001acc76d95e018207"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.648034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" 
event={"ID":"fb1fcea4-1195-476d-b01c-875ab966a466","Type":"ContainerStarted","Data":"2005987929ae1eeb532c0423a7055964d7a12405dfaf16a265171b230a1bd6a2"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.672551 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" event={"ID":"7b04041a-699e-42d5-a26e-3af266dc33ee","Type":"ContainerStarted","Data":"951bc10b228980734f039b4fc97ff864a34b0fe69646a35c63dd7f47d3086009"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.674204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" event={"ID":"cffc3e00-cf34-448e-83c9-96c95a232caa","Type":"ContainerStarted","Data":"f8275bab40f8619feffad180e0cd70589482de610f16e7b1fe37540f51ca7442"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.675348 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.675596 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.175585582 +0000 UTC m=+145.160405871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.676840 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" event={"ID":"9503659c-61c6-42ce-988d-ee76cef6db29","Type":"ContainerStarted","Data":"a4f1738482a993aae3ab906254e6d0e83d0cad4edaf5c7858b9cd40165749c50"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.679897 4752 generic.go:334] "Generic (PLEG): container finished" podID="0bff57c9-60c1-47ad-8b27-040cb3453a55" containerID="6613865f9ecc6af9500e0d84fde1f756078d5f97a38cfcc60480213898d412c1" exitCode=0 Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.679959 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" event={"ID":"0bff57c9-60c1-47ad-8b27-040cb3453a55","Type":"ContainerDied","Data":"6613865f9ecc6af9500e0d84fde1f756078d5f97a38cfcc60480213898d412c1"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.681408 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" event={"ID":"cae410f5-e7a9-42e2-9aaa-d6634d848e58","Type":"ContainerStarted","Data":"79ddf498b9f4c5466582cc2465780e47b7ea4df2bf2ae735527a5bc9d023d36d"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.682427 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" event={"ID":"224c53cb-9374-44a5-8256-0c45d1fbab84","Type":"ContainerStarted","Data":"16c47d3311c11e5ad04a425caa5e15a6e224fb4c30a2852b0423b15a47354ad7"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.683468 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" event={"ID":"f3147bef-3d3e-4b85-be3f-bfd0887849a9","Type":"ContainerStarted","Data":"9fcbb0d3152e8118cfdec5b2b76fe181d5ca9a75b2cdabcc25912adc4cb01364"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.684118 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" event={"ID":"b03c5740-8446-4caf-9072-274155052591","Type":"ContainerStarted","Data":"1615dd51bc80e6c73022bf3024f231ebcc113f3738ecf02b37bdba5e7f57bd58"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.684705 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" event={"ID":"6a04c082-6938-494c-833e-65f1f25817a9","Type":"ContainerStarted","Data":"599b931443e63941eafa92069d2b2517bb184498425455745a8935026a964f42"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.685517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" event={"ID":"3a9e501c-00d3-4607-8b6b-b01485ba4961","Type":"ContainerStarted","Data":"1d9ba2fbbcbc2c9fc6a08039f4e634401683bcda2a674ef9a8d675ce2409cfe8"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.691245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w7dbc" event={"ID":"c0998cf9-2c91-46b3-a9b8-a070e8b855b2","Type":"ContainerStarted","Data":"7d629205d1182fd8e91d388e569c635aa5dbd59d86e56f9ce130f3b5272343cf"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.697519 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" event={"ID":"5618f796-dad3-4ed3-bff7-ceed08f8b07c","Type":"ContainerStarted","Data":"5b38b0a182fb10e8de0d351c1276c310c4e2b11a494e4cd1a2387c22f9487f20"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.699890 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" event={"ID":"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2","Type":"ContainerStarted","Data":"1d0578b7ad294e7e1144959aa1b52a6ee7ebf7c8fd6420848e5ea20ea69595ed"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.702290 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pchf4" event={"ID":"57e2fd42-88ec-49e8-b089-ff3d0a9405cf","Type":"ContainerStarted","Data":"fc059fed7060ba4f3a2086e72ff4e65c1d90f27c066b18cf706a829a691512fe"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.708377 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" event={"ID":"bae98c3e-6921-4f96-beab-9d294469a8fe","Type":"ContainerStarted","Data":"49efdbf746cbbef8f25a7d41b23d1f1ba52fa9227b240b96e225c3a5227fba3e"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.719437 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" 
event={"ID":"101dea51-f55e-4def-ae4f-e3d0ce84351f","Type":"ContainerStarted","Data":"07c9c2498e8b4600cd6b63c9d8f7ac7076700c45111bfc2b303512a0790cb271"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.722397 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sght6" event={"ID":"e7f6a4fd-346f-4828-b077-4d4888917d6a","Type":"ContainerStarted","Data":"f2435f28b989c1406244cc0f20b7baa09a63c50be668f80a74d7f6d7b69f38fe"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.722843 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.725440 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-sght6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.725497 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sght6" podUID="e7f6a4fd-346f-4828-b077-4d4888917d6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.757821 4752 patch_prober.go:28] interesting pod/console-operator-58897d9998-l5zrr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.758029 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" podUID="f39385e4-9c04-4aae-878c-34ab4d097664" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.767413 4752 generic.go:334] "Generic (PLEG): container finished" podID="ac73154f-7953-4cb0-b785-ebed9795fe9e" containerID="48e3351e7eead63c3bee40b5801e4eb59ee8379a8c37be943e20a6b5315dd690" exitCode=0 Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.768398 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" event={"ID":"f39385e4-9c04-4aae-878c-34ab4d097664","Type":"ContainerStarted","Data":"ab8a50df81987cfaa8ccd30ab8a9e46b9693d5da68f5fae383c65bf03e4c2ef0"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.768425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" event={"ID":"ac73154f-7953-4cb0-b785-ebed9795fe9e","Type":"ContainerDied","Data":"48e3351e7eead63c3bee40b5801e4eb59ee8379a8c37be943e20a6b5315dd690"} Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.768441 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.784288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.786376 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.286348893 +0000 UTC m=+145.271169222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.795666 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xdzjw"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.797861 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sd7kr"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.812689 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ljttt"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.814062 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cvplk"] Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.885793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.887385 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.387349068 +0000 UTC m=+145.372169347 (durationBeforeRetry 500ms). 
Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.887385 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.387349068 +0000 UTC m=+145.372169347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.946241 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"]
Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.958990 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72"]
Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.965377 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt"]
Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.979215 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd"]
Nov 24 11:08:58 crc kubenswrapper[4752]: I1124 11:08:58.986445 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:08:58 crc kubenswrapper[4752]: E1124 11:08:58.986825 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.486804076 +0000 UTC m=+145.471624375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.015227 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qvffh"]
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.087727 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.089093 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.58907858 +0000 UTC m=+145.573898869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.188074 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" podStartSLOduration=124.188057444 podStartE2EDuration="2m4.188057444s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:59.186382163 +0000 UTC m=+145.171202462" watchObservedRunningTime="2025-11-24 11:08:59.188057444 +0000 UTC m=+145.172877733"
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.189367 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
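Note: the pod_startup_latency_tracker entry above is plain arithmetic: podStartSLOduration=124.188057444 s is watchObservedRunningTime (11:08:59.188057444) minus podCreationTimestamp (11:06:55), i.e. the reported podStartE2EDuration of 2m4.188057444s, and the zeroed firstStartedPulling/lastFinishedPulling values ("0001-01-01 ...") mean no image pull was observed for this pod. The same subtraction, using the values from that entry:

```go
// slo_duration_sketch.go - reproduce the podStartSLOduration arithmetic;
// both timestamps are copied from the log entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created, _ := time.Parse(layout, "2025-11-24 11:06:55 +0000 UTC")
	watched, _ := time.Parse(layout, "2025-11-24 11:08:59.188057444 +0000 UTC")
	// Prints 2m4.188057444s, matching podStartE2EDuration exactly.
	fmt.Println(watched.Sub(created))
}
```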
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.189855 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.689823427 +0000 UTC m=+145.674643716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.235649 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" podStartSLOduration=124.235631293 podStartE2EDuration="2m4.235631293s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:59.226588419 +0000 UTC m=+145.211408708" watchObservedRunningTime="2025-11-24 11:08:59.235631293 +0000 UTC m=+145.220451572"
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.296506 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.296977 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.796965038 +0000 UTC m=+145.781785327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.319954 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r4qhq" podStartSLOduration=124.319931423 podStartE2EDuration="2m4.319931423s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:59.260184325 +0000 UTC m=+145.245004624" watchObservedRunningTime="2025-11-24 11:08:59.319931423 +0000 UTC m=+145.304751712"
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.400226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.402350 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn" podStartSLOduration=124.402335025 podStartE2EDuration="2m4.402335025s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:59.376325679 +0000 UTC m=+145.361145968" watchObservedRunningTime="2025-11-24 11:08:59.402335025 +0000 UTC m=+145.387155314"
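Note: the m=+145.xxx / m=+146.xxx suffixes in these timestamps are Go's monotonic-clock reading, which time.Time prints as seconds since the process acquired its clock; here they track kubelet uptime, which is why each retry deadline carries both a wall-clock time and an m=+ offset. A two-line illustration:

```go
// monotonic_sketch.go - time.Now() carries a monotonic reading that
// time.Time's String() prints as an "m=±<seconds>" suffix, the same form
// seen in the kubelet timestamps above.
package main

import (
	"fmt"
	"time"
)

func main() {
	fmt.Println(time.Now()) // e.g. 2025-11-24 11:08:59.79 +0000 UTC m=+0.000000001
	time.Sleep(500 * time.Millisecond)
	fmt.Println(time.Now()) // the m=+ offset has advanced by ~0.5s
}
```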
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.402597 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.900440218 +0000 UTC m=+145.885260507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.402761 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.402969 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sght6" podStartSLOduration=124.402962874 podStartE2EDuration="2m4.402962874s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:08:59.402018936 +0000 UTC m=+145.386839215" watchObservedRunningTime="2025-11-24 11:08:59.402962874 +0000 UTC m=+145.387783163"
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.403178 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:08:59.903166791 +0000 UTC m=+145.887987080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.503953 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.504650 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.00463288 +0000 UTC m=+145.989453169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.606392 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.607148 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.10713308 +0000 UTC m=+146.091953369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.715645 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.716236 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.21621678 +0000 UTC m=+146.201037069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.819865 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.820251 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.320241056 +0000 UTC m=+146.305061345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.833384 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" event={"ID":"cae410f5-e7a9-42e2-9aaa-d6634d848e58","Type":"ContainerStarted","Data":"0ce625ad91c07bb8d5e39df7b3447199cbcf00991428c52db4fccc70ac21ce81"}
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.879334 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" event={"ID":"f3147bef-3d3e-4b85-be3f-bfd0887849a9","Type":"ContainerStarted","Data":"1c74234bd7be3703a629baad3da90eacb4e8dd1ec2060e3540be62486ae847bd"}
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.891403 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sd7kr" event={"ID":"891cfebf-6f4e-446d-85f4-4231622d7886","Type":"ContainerStarted","Data":"9c28e81148d55da65f540d43a737352082f29b2e223243c1364ffa747b2b27a6"}
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.896611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" event={"ID":"08f929dc-5226-49d2-9048-9b4fb65eff57","Type":"ContainerStarted","Data":"2084e800612010425de4134b0f4e592463f778ee8abba390e0431eefdbdda80e"}
Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.900289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x7lpn" event={"ID":"df4cedec-414c-4253-8a55-79ed8c8734c1","Type":"ContainerStarted","Data":"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f"}
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:08:59 crc kubenswrapper[4752]: E1124 11:08:59.925658 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.425643565 +0000 UTC m=+146.410463854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.944676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" event={"ID":"fb1fcea4-1195-476d-b01c-875ab966a466","Type":"ContainerStarted","Data":"3fc54b594547b662847afb017e224df38818180f78c785eae91833a5de0d08c5"} Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.962266 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" event={"ID":"bae98c3e-6921-4f96-beab-9d294469a8fe","Type":"ContainerStarted","Data":"4c48bd3e6394cd34fa7ba66a345421aa697d491c8463990ed149ba2cc5d84308"} Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.993998 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" event={"ID":"782a3373-7524-40dc-b312-08c40423ffb6","Type":"ContainerStarted","Data":"d67a0de00949b9e5e49e77b96b2873e494ff258be25e84212ee94f2aee603023"} Nov 24 11:08:59 crc kubenswrapper[4752]: I1124 11:08:59.995291 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.042694 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bps76 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.042760 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.045150 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:00 crc kubenswrapper[4752]: 
Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.047420 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.547400238 +0000 UTC m=+146.532220527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.110219 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" event={"ID":"5618f796-dad3-4ed3-bff7-ceed08f8b07c","Type":"ContainerStarted","Data":"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.113827 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-965hb"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.115565 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-x7lpn" podStartSLOduration=125.115547229 podStartE2EDuration="2m5.115547229s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.046616544 +0000 UTC m=+146.031436833" watchObservedRunningTime="2025-11-24 11:09:00.115547229 +0000 UTC m=+146.100367518"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.115700 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-857vf" podStartSLOduration=125.115691483 podStartE2EDuration="2m5.115691483s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.076541179 +0000 UTC m=+146.061361468" watchObservedRunningTime="2025-11-24 11:09:00.115691483 +0000 UTC m=+146.100511792"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.132180 4752 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-965hb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.132229 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.154736 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.654721954 +0000 UTC m=+146.639542243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.173764 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" podStartSLOduration=124.173732959 podStartE2EDuration="2m4.173732959s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.164629514 +0000 UTC m=+146.149449793" watchObservedRunningTime="2025-11-24 11:09:00.173732959 +0000 UTC m=+146.158553248" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.201649 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" event={"ID":"9b5fea82-8918-4237-99a6-eae4894a0b5f","Type":"ContainerStarted","Data":"e911cdf34bcf8b0ea40f15e61a1199c3f3b962babd12e53e1dc693e11eee1b06"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.232632 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" event={"ID":"b03c5740-8446-4caf-9072-274155052591","Type":"ContainerStarted","Data":"8d3344adbcef07c3cabb5c4206a3e1b343e40a620e3791b7c84177da1822fc9b"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.250016 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvffh" event={"ID":"2521dbb1-8413-4180-bd70-2cf9019e583c","Type":"ContainerStarted","Data":"a8e923764bc1a683dc12a53217f80f92d9e879639e35a86e6e5c39e33011056b"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.254993 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.256236 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" event={"ID":"eac73a5d-96c7-4aad-8882-edbb7eb2b55d","Type":"ContainerStarted","Data":"7d006870e4e4db1875fd6bd7ad84317058c57698c8e89f21872984638d17cba3"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.262074 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" 
event={"ID":"eac73a5d-96c7-4aad-8882-edbb7eb2b55d","Type":"ContainerStarted","Data":"1fc403c368bb91faaccf9ab04afc9fc88df536a7d50e612e16ef9d3a5e060adc"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.262109 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.257779 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.757759151 +0000 UTC m=+146.742579430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.259447 4752 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-txwcw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.262928 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" podUID="eac73a5d-96c7-4aad-8882-edbb7eb2b55d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.269184 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" event={"ID":"8d1db8d4-8511-4932-8478-61ddedab3680","Type":"ContainerStarted","Data":"231cc19d1306b4f7f5e63e8edfa14134dd7c4c7ff9f1bde0dee8ead81ec100e2"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.275567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" event={"ID":"f619c2d4-bcf5-4403-acd6-0bf90e2ece94","Type":"ContainerStarted","Data":"aaad34a136dc98b5ba27ef389f2340a2bc6281421784a376f9c260099d91a0e8"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.290560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" event={"ID":"3a9e501c-00d3-4607-8b6b-b01485ba4961","Type":"ContainerStarted","Data":"a48b956eabffafdeda1e784a32efdb2f44fdb3f3375aff9af4a64ca8ddd75ab2"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.293890 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" event={"ID":"fde8828e-798f-4da5-9f44-0b8a2726dcb1","Type":"ContainerStarted","Data":"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.295375 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:09:00 
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.305385 4752 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-cprsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.305436 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.307175 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" podStartSLOduration=125.307158405 podStartE2EDuration="2m5.307158405s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.292713658 +0000 UTC m=+146.277533947" watchObservedRunningTime="2025-11-24 11:09:00.307158405 +0000 UTC m=+146.291978694"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.340320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" event={"ID":"cffc3e00-cf34-448e-83c9-96c95a232caa","Type":"ContainerStarted","Data":"51580b2a5cfe7967ee4e0141dcb9a3e773b3729647bae124df3ff3c9788197b2"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.340379 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" event={"ID":"cffc3e00-cf34-448e-83c9-96c95a232caa","Type":"ContainerStarted","Data":"42dba16f23a84219eef26180d906ba9915daa0ec19e0badfd36a9774f11fe152"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.361831 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" event={"ID":"6e9f72eb-52cd-47cc-b939-3301c0aa7f3c","Type":"ContainerStarted","Data":"4696a6d4ca04267290552258d2caa828d7a5f4622237dfbe348d238284d14788"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.362955 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.364238 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.864220951 +0000 UTC m=+146.849041250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.392520 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rmq2r" podStartSLOduration=125.392502506 podStartE2EDuration="2m5.392502506s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.361328683 +0000 UTC m=+146.346148982" watchObservedRunningTime="2025-11-24 11:09:00.392502506 +0000 UTC m=+146.377322795"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.394387 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" podStartSLOduration=124.394379523 podStartE2EDuration="2m4.394379523s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.393304201 +0000 UTC m=+146.378124490" watchObservedRunningTime="2025-11-24 11:09:00.394379523 +0000 UTC m=+146.379199812"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.418549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" event={"ID":"9503659c-61c6-42ce-988d-ee76cef6db29","Type":"ContainerStarted","Data":"d1c73302bbc7d9c2a38d878412329d5922561a49688363414efb260c627003fc"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.431027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" event={"ID":"b8ee461c-f13e-49f3-ba77-8063e92a9d01","Type":"ContainerStarted","Data":"59b46a16473dbe566e5f5e8215aaf4c2794c26a34b7d2e9dc63bc16dd359b2f7"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.464167 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.467326 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:00.967315319 +0000 UTC m=+146.952135608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.474955 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-txmns" podStartSLOduration=125.47494085 podStartE2EDuration="2m5.47494085s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.439045334 +0000 UTC m=+146.423865653" watchObservedRunningTime="2025-11-24 11:09:00.47494085 +0000 UTC m=+146.459761139"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.504662 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jntpt" podStartSLOduration=125.504645188 podStartE2EDuration="2m5.504645188s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.475864928 +0000 UTC m=+146.460685217" watchObservedRunningTime="2025-11-24 11:09:00.504645188 +0000 UTC m=+146.489465477"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.505473 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-chccp" podStartSLOduration=125.505466193 podStartE2EDuration="2m5.505466193s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.503308718 +0000 UTC m=+146.488129007" watchObservedRunningTime="2025-11-24 11:09:00.505466193 +0000 UTC m=+146.490286482"
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.523176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" event={"ID":"9c0f8aa1-dec1-46a3-ad52-b1e3275504b2","Type":"ContainerStarted","Data":"fac16046e9ee1c58b012b143a629b190ea7bd0afbebe90a75695f3df85fbe5c1"}
Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.543706 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" podStartSLOduration=124.543690259 podStartE2EDuration="2m4.543690259s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.535962576 +0000 UTC m=+146.520782875" watchObservedRunningTime="2025-11-24 11:09:00.543690259 +0000 UTC m=+146.528510548"
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.567158 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.067135919 +0000 UTC m=+147.051956208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.592016 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" event={"ID":"93b229ce-883d-43b5-a41c-1012da66417b","Type":"ContainerStarted","Data":"a1e0df6d6ac45b47a59af9febed13c0323a0a5bdf27a3c1f14c9626c9605f720"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.605999 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7xbdq" podStartSLOduration=124.605979554 podStartE2EDuration="2m4.605979554s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.565243391 +0000 UTC m=+146.550063710" watchObservedRunningTime="2025-11-24 11:09:00.605979554 +0000 UTC m=+146.590799853" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.607277 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-v9rmm" podStartSLOduration=124.607269953 podStartE2EDuration="2m4.607269953s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.605077656 +0000 UTC m=+146.589897945" watchObservedRunningTime="2025-11-24 11:09:00.607269953 +0000 UTC m=+146.592090242" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.653052 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" event={"ID":"5284b897-69bd-490f-a02d-2d8647c73029","Type":"ContainerStarted","Data":"de6759f2235a4d6258799629e19274e6390c125c5660e2f08a0e9c12d95af319"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.663553 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" event={"ID":"8731be1c-9d01-46ea-9237-741ac9b9eb31","Type":"ContainerStarted","Data":"1d73768c89c6418b61e81167cad267721682f71b2591f39fc5769d24ee1e2589"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.672503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.673993 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.173980491 +0000 UTC m=+147.158800780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.678369 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" event={"ID":"a5b18d85-5b4b-4e2d-a357-39462b1bbcd6","Type":"ContainerStarted","Data":"8987dbd609f8297ca67b5e69d96ad98acb11443f273ac0b4bbfce42098ca62d2"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.678414 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.679473 4752 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wbs52 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.679517 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" podUID="a5b18d85-5b4b-4e2d-a357-39462b1bbcd6" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.685461 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-znzhg" podStartSLOduration=125.685439847 podStartE2EDuration="2m5.685439847s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.652941194 +0000 UTC m=+146.637761483" watchObservedRunningTime="2025-11-24 11:09:00.685439847 +0000 UTC m=+146.670260136" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.685871 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5l427" podStartSLOduration=124.68586632 podStartE2EDuration="2m4.68586632s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.67925703 +0000 UTC m=+146.664077319" watchObservedRunningTime="2025-11-24 11:09:00.68586632 +0000 UTC m=+146.670686609" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.700280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" event={"ID":"6a04c082-6938-494c-833e-65f1f25817a9","Type":"ContainerStarted","Data":"7b2357777ced224dedb6e224b69fb3d6b189aed2927ff75fef6666427834609f"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.700324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" event={"ID":"6a04c082-6938-494c-833e-65f1f25817a9","Type":"ContainerStarted","Data":"c4f67dce5f5ffa14f4d6575f7edd8948b9416546804a8949ad4c6d69d3cff713"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.707919 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fvb85" podStartSLOduration=125.707899596 podStartE2EDuration="2m5.707899596s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.706607077 +0000 UTC m=+146.691427366" watchObservedRunningTime="2025-11-24 11:09:00.707899596 +0000 UTC m=+146.692719885" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.712110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" event={"ID":"14136410-157d-4287-96e1-6939c887bef3","Type":"ContainerStarted","Data":"4ee39ac6990c2864bbe0555a0241118110871c3b5bc08f3b55c9f6e3c63e0c2b"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.719900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pchf4" event={"ID":"57e2fd42-88ec-49e8-b089-ff3d0a9405cf","Type":"ContainerStarted","Data":"d80dc1b0e7a13c9ccd6b4a8df09c991d3b36bf3d1bf1e224e5161b33b479b8d0"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.727329 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" event={"ID":"224c53cb-9374-44a5-8256-0c45d1fbab84","Type":"ContainerStarted","Data":"9389d0cbefadf06269bfbada038f6f98ac637e315d7e81a3a77d2a5e1ad7061c"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.757031 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" event={"ID":"3258c094-1ed5-4938-8b6c-284465245440","Type":"ContainerStarted","Data":"47b0e0ef66c6cbf8aed1f4ceab53b702beacf23a5c03899e9d3cf8e86a421fbe"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.764351 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" event={"ID":"101dea51-f55e-4def-ae4f-e3d0ce84351f","Type":"ContainerStarted","Data":"c28ddccf286c999224531fced50cba225e65abab4841731ef07784737fc0859b"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.774068 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.775654 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.275630665 +0000 UTC m=+147.260450954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.777274 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" podStartSLOduration=124.777255414 podStartE2EDuration="2m4.777255414s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.776616365 +0000 UTC m=+146.761436654" watchObservedRunningTime="2025-11-24 11:09:00.777255414 +0000 UTC m=+146.762075703" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.783057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w7dbc" event={"ID":"c0998cf9-2c91-46b3-a9b8-a070e8b855b2","Type":"ContainerStarted","Data":"0d2045aa2b17fc5398051e76573fcf8f76f06f45e91c5ba88856926fceb28f3f"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.786606 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" event={"ID":"05ea66c9-da9e-44a9-b85b-11514abd77e1","Type":"ContainerStarted","Data":"50906ba6b3d5df9fcdda8f477cf224357aa915b3f3abbbace9a10e25b1a77788"} Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.787275 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-sght6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.787325 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sght6" podUID="e7f6a4fd-346f-4828-b077-4d4888917d6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.788861 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" podStartSLOduration=124.788423182 podStartE2EDuration="2m4.788423182s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.738332327 +0000 UTC m=+146.723152616" watchObservedRunningTime="2025-11-24 11:09:00.788423182 +0000 UTC m=+146.773243471" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.798138 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-l5zrr" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.827857 4752 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.842524 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" podStartSLOduration=124.842503668 podStartE2EDuration="2m4.842503668s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.824425101 +0000 UTC m=+146.809245410" watchObservedRunningTime="2025-11-24 11:09:00.842503668 +0000 UTC m=+146.827323957" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.876187 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w7dbc" podStartSLOduration=125.876166056 podStartE2EDuration="2m5.876166056s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.847173739 +0000 UTC m=+146.831994018" watchObservedRunningTime="2025-11-24 11:09:00.876166056 +0000 UTC m=+146.860986345" Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.877785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.886093 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.386067456 +0000 UTC m=+147.370887745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:00 crc kubenswrapper[4752]: I1124 11:09:00.988486 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:00 crc kubenswrapper[4752]: E1124 11:09:00.988848 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.488833724 +0000 UTC m=+147.473654013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.015480 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" podStartSLOduration=126.01546183 podStartE2EDuration="2m6.01546183s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:00.954907188 +0000 UTC m=+146.939727477" watchObservedRunningTime="2025-11-24 11:09:01.01546183 +0000 UTC m=+147.000282119" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.042311 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bkhh5" podStartSLOduration=126.042294471 podStartE2EDuration="2m6.042294471s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.040719274 +0000 UTC m=+147.025539573" watchObservedRunningTime="2025-11-24 11:09:01.042294471 +0000 UTC m=+147.027114750" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.043138 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhc89" podStartSLOduration=126.043130807 podStartE2EDuration="2m6.043130807s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.014327555 +0000 UTC m=+146.999147844" watchObservedRunningTime="2025-11-24 11:09:01.043130807 +0000 UTC m=+147.027951096" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.092375 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.092762 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.592733977 +0000 UTC m=+147.577554266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.124930 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c2q5l" podStartSLOduration=126.124914621 podStartE2EDuration="2m6.124914621s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.088059326 +0000 UTC m=+147.072879635" watchObservedRunningTime="2025-11-24 11:09:01.124914621 +0000 UTC m=+147.109734910" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.161848 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gxnqd" podStartSLOduration=126.161828377 podStartE2EDuration="2m6.161828377s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.161346753 +0000 UTC m=+147.146167042" watchObservedRunningTime="2025-11-24 11:09:01.161828377 +0000 UTC m=+147.146648666" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.162639 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pchf4" podStartSLOduration=7.162634421 podStartE2EDuration="7.162634421s" podCreationTimestamp="2025-11-24 11:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.131022795 +0000 UTC m=+147.115843084" watchObservedRunningTime="2025-11-24 11:09:01.162634421 +0000 UTC m=+147.147454711" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.193894 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.194236 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.694221007 +0000 UTC m=+147.679041296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.295657 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.296083 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.796065718 +0000 UTC m=+147.780886087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.396882 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.397011 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.896983899 +0000 UTC m=+147.881804188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.397345 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.397612 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.897602618 +0000 UTC m=+147.882422907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.498487 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.498629 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.998603763 +0000 UTC m=+147.983424052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.498723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.499001 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:01.998992275 +0000 UTC m=+147.983812554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.578016 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.582014 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 11:09:01 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Nov 24 11:09:01 crc kubenswrapper[4752]: [+]process-running ok Nov 24 11:09:01 crc kubenswrapper[4752]: healthz check failed Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.582065 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.600002 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.600477 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.100458124 +0000 UTC m=+148.085278413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.701026 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.701392 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.201376317 +0000 UTC m=+148.186196606 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.792435 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" event={"ID":"f619c2d4-bcf5-4403-acd6-0bf90e2ece94","Type":"ContainerStarted","Data":"723e98c50dd31d9365836ef6b053573db97c3c9d825af715ecd2ce5d83664ba8"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.801915 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.802115 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.302087603 +0000 UTC m=+148.286907892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.802240 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.802573 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.302555627 +0000 UTC m=+148.287376016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.807312 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sd7kr" event={"ID":"891cfebf-6f4e-446d-85f4-4231622d7886","Type":"ContainerStarted","Data":"110b79ce828e217f1051fb9eabc22749d0ee163e60330b3b439aa72aa8f86a9b"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.807350 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sd7kr" event={"ID":"891cfebf-6f4e-446d-85f4-4231622d7886","Type":"ContainerStarted","Data":"4ad1d5362739edff18e5b90e2dc14dfdd019522a061dbd247994a4d143817212"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.807398 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sd7kr" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.809320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qvffh" event={"ID":"2521dbb1-8413-4180-bd70-2cf9019e583c","Type":"ContainerStarted","Data":"c921cebf76902c58c14c3f2760a171bf9c88b468d1ed6a82aaf77e55f3edd3e8"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.812071 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" event={"ID":"08f929dc-5226-49d2-9048-9b4fb65eff57","Type":"ContainerStarted","Data":"538e1058e23d11e17fdc57e8cc762f3fb49ed589dfc14ae325e55190ed864373"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.812797 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.814872 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" event={"ID":"3258c094-1ed5-4938-8b6c-284465245440","Type":"ContainerStarted","Data":"23b9fbf452f0e503bfea8143065cf4bbcef48887ecc926326a026d2a478f4f12"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.814909 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fnfwt" event={"ID":"3258c094-1ed5-4938-8b6c-284465245440","Type":"ContainerStarted","Data":"8e89698e2c59984459f490c16c3df4f0c37aebe693df9e7edd0810b6110eb1a1"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.817193 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xdzjw" event={"ID":"5284b897-69bd-490f-a02d-2d8647c73029","Type":"ContainerStarted","Data":"365aa9c491e6325dbee2f3b53285d7eadac81339cb8647d66dfe5390993be7ce"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.820729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" event={"ID":"bae98c3e-6921-4f96-beab-9d294469a8fe","Type":"ContainerStarted","Data":"71cfa5f470014ee9dd847db33dc253bbee02ad18c120afbb6e57c7b07c4fc10c"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.823453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" event={"ID":"05ea66c9-da9e-44a9-b85b-11514abd77e1","Type":"ContainerStarted","Data":"c1d80fe8e95ec715d457ae79065d2cb84e334de3ef3fc718ebcaf0b282442bd8"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.824706 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" event={"ID":"782a3373-7524-40dc-b312-08c40423ffb6","Type":"ContainerStarted","Data":"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.825508 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bps76 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.825540 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.827337 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" event={"ID":"ac73154f-7953-4cb0-b785-ebed9795fe9e","Type":"ContainerStarted","Data":"d9de272a9d3598b2e8a362613b3f59005a8fc4c4fa702ff5dd87b5d890e46fd3"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.829231 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" event={"ID":"8731be1c-9d01-46ea-9237-741ac9b9eb31","Type":"ContainerStarted","Data":"152f17042967379e85c5c870bf72c1386c85b58cad96d33a5dfa0fac62c16aa8"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.829256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" 
event={"ID":"8731be1c-9d01-46ea-9237-741ac9b9eb31","Type":"ContainerStarted","Data":"ac922aa4f74e477d7848da878c2ab6c9afea26d65aa83b35e85519178a89023a"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.829918 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.835537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" event={"ID":"0bff57c9-60c1-47ad-8b27-040cb3453a55","Type":"ContainerStarted","Data":"055424ccc74fac55360ed87d63efbb895a9994c905e33c143bab9420f9fe404f"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.835617 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" event={"ID":"0bff57c9-60c1-47ad-8b27-040cb3453a55","Type":"ContainerStarted","Data":"d95d1c3424f8ace4c617bb65b975c1f9acac059b21dacd1a639f1ac957d4d799"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.837462 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" event={"ID":"fb1fcea4-1195-476d-b01c-875ab966a466","Type":"ContainerStarted","Data":"fb41a659e9f7d1909c181067b33322224dd398ed731138a8f5cd973199b2fdb9"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.837973 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.839919 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4tkbk" event={"ID":"93b229ce-883d-43b5-a41c-1012da66417b","Type":"ContainerStarted","Data":"81699195c871241e14c95acf6f4a3a12c2ead66bb94909195329ff116b14e02f"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.843729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" event={"ID":"cae410f5-e7a9-42e2-9aaa-d6634d848e58","Type":"ContainerStarted","Data":"6c69511896b009c3d37d63a5e1ff7f66c196b0f7e98e88db0f34a57191ffa11c"} Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.853701 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.854454 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" podStartSLOduration=126.854439377 podStartE2EDuration="2m6.854439377s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.852662043 +0000 UTC m=+147.837482332" watchObservedRunningTime="2025-11-24 11:09:01.854439377 +0000 UTC m=+147.839259666" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.893591 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" podStartSLOduration=125.893574911 podStartE2EDuration="2m5.893574911s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 
11:09:01.893158878 +0000 UTC m=+147.877979167" watchObservedRunningTime="2025-11-24 11:09:01.893574911 +0000 UTC m=+147.878395200" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.908451 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:01 crc kubenswrapper[4752]: E1124 11:09:01.910286 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.410260825 +0000 UTC m=+148.395081114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.929108 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbs52" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.938334 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jjd72" podStartSLOduration=125.938317714 podStartE2EDuration="2m5.938317714s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.937222161 +0000 UTC m=+147.922042450" watchObservedRunningTime="2025-11-24 11:09:01.938317714 +0000 UTC m=+147.923138003" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.991892 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.992224 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:09:01 crc kubenswrapper[4752]: I1124 11:09:01.999262 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sd7kr" podStartSLOduration=7.999248187 podStartE2EDuration="7.999248187s" podCreationTimestamp="2025-11-24 11:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:01.9970228 +0000 UTC m=+147.981843089" watchObservedRunningTime="2025-11-24 11:09:01.999248187 +0000 UTC m=+147.984068476" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.007449 4752 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xvmg4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.007494 4752 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" podUID="0bff57c9-60c1-47ad-8b27-040cb3453a55" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.012166 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.012883 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.512847748 +0000 UTC m=+148.497668037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.027566 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.028132 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.079013 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qvffh" podStartSLOduration=8.078997009 podStartE2EDuration="8.078997009s" podCreationTimestamp="2025-11-24 11:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.027475981 +0000 UTC m=+148.012296290" watchObservedRunningTime="2025-11-24 11:09:02.078997009 +0000 UTC m=+148.063817298" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.080696 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jzbd4" podStartSLOduration=127.08068994 podStartE2EDuration="2m7.08068994s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.079817774 +0000 UTC m=+148.064638063" watchObservedRunningTime="2025-11-24 11:09:02.08068994 +0000 UTC m=+148.065510229" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.114327 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 
11:09:02.114627 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.614609157 +0000 UTC m=+148.599429446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.161418 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ptrzc" podStartSLOduration=127.161403042 podStartE2EDuration="2m7.161403042s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.110366398 +0000 UTC m=+148.095186687" watchObservedRunningTime="2025-11-24 11:09:02.161403042 +0000 UTC m=+148.146223331" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.198500 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" podStartSLOduration=127.198483564 podStartE2EDuration="2m7.198483564s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.161150624 +0000 UTC m=+148.145970913" watchObservedRunningTime="2025-11-24 11:09:02.198483564 +0000 UTC m=+148.183303853" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.200593 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" podStartSLOduration=126.200586467 podStartE2EDuration="2m6.200586467s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.199152214 +0000 UTC m=+148.183972503" watchObservedRunningTime="2025-11-24 11:09:02.200586467 +0000 UTC m=+148.185406756" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.215428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.215932 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.715916851 +0000 UTC m=+148.700737140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.274886 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ljttt" podStartSLOduration=126.274867984 podStartE2EDuration="2m6.274867984s" podCreationTimestamp="2025-11-24 11:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:02.230270995 +0000 UTC m=+148.215091294" watchObservedRunningTime="2025-11-24 11:09:02.274867984 +0000 UTC m=+148.259688273" Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.317949 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.318370 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.818352349 +0000 UTC m=+148.803172638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.421613 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.421925 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:02.921912362 +0000 UTC m=+148.906732651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.464314 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5cwn"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.522840 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.523037 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.02301286 +0000 UTC m=+149.007833149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.523181 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.523504 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.023490485 +0000 UTC m=+149.008310774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.584788 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 11:09:02 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Nov 24 11:09:02 crc kubenswrapper[4752]: [+]process-running ok
Nov 24 11:09:02 crc kubenswrapper[4752]: healthz check failed
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.584857 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624227 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.624461 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.124434768 +0000 UTC m=+149.109255057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624512 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624604 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624666 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624768 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.624787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.625095 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.125073337 +0000 UTC m=+149.109893716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.626441 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.635496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.645434 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.652482 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.725353 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.725709 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.225693661 +0000 UTC m=+149.210513950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.827217 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.827760 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.327730677 +0000 UTC m=+149.312550966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.842492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.844771 4752 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-txwcw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded" start-of-body=
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.844916 4752 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-965hb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.845261 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" podUID="eac73a5d-96c7-4aad-8882-edbb7eb2b55d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": context deadline exceeded"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.845342 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.852855 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.857081 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.879773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" event={"ID":"05ea66c9-da9e-44a9-b85b-11514abd77e1","Type":"ContainerStarted","Data":"f67c395700a134fd973e4dbbf5d6452527a1268e2ccb347c3b81b5501e36f06c"}
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.884502 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bps76 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.898716 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.929589 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.929904 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.429876867 +0000 UTC m=+149.414697156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:02 crc kubenswrapper[4752]: I1124 11:09:02.930117 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:02 crc kubenswrapper[4752]: E1124 11:09:02.930513 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.430504626 +0000 UTC m=+149.415324915 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.031210 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.033488 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.53346121 +0000 UTC m=+149.518281499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.135364 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.135939 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.63592225 +0000 UTC m=+149.620742539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.236446 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.236856 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.736838292 +0000 UTC m=+149.721658581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.338132 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.338216 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.838183488 +0000 UTC m=+149.823003767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.343405 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.439851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.440167 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:03.940152322 +0000 UTC m=+149.924972611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.493639 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-965hb"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.541647 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.541965 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.041953382 +0000 UTC m=+150.026773671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.597460 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 11:09:03 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Nov 24 11:09:03 crc kubenswrapper[4752]: [+]process-running ok
Nov 24 11:09:03 crc kubenswrapper[4752]: healthz check failed
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.597718 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.652417 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.652759 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.152722462 +0000 UTC m=+150.137542751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.663792 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-glr4w"]
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.664919 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.674297 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.688377 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glr4w"]
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.753363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.753808 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.253790969 +0000 UTC m=+150.238611258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.855302 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.855464 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.355439154 +0000 UTC m=+150.340259443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.855660 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.855762 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf4j\" (UniqueName: \"kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.855782 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.855813 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.858373 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.358356782 +0000 UTC m=+150.343177071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.871417 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v468m"]
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.896850 4752 generic.go:334] "Generic (PLEG): container finished" podID="f619c2d4-bcf5-4403-acd6-0bf90e2ece94" containerID="723e98c50dd31d9365836ef6b053573db97c3c9d825af715ecd2ce5d83664ba8" exitCode=0
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.897443 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c4f126e8bb0225228d2007a56ecb954c4065ef84c75ad520b4761d3a3af942e6"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.897498 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" event={"ID":"05ea66c9-da9e-44a9-b85b-11514abd77e1","Type":"ContainerStarted","Data":"d47095a12ca48e0d0257610380dd82ea5dd745ad6046795cdef3c20a01a3ac7c"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.897514 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v468m"]
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.897537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" event={"ID":"f619c2d4-bcf5-4403-acd6-0bf90e2ece94","Type":"ContainerDied","Data":"723e98c50dd31d9365836ef6b053573db97c3c9d825af715ecd2ce5d83664ba8"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.897948 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.899990 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.902711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e47a701732dad1a11fdbecf0491bd1d6063f634da3ce0d43452d393e65233793"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.902771 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b1192d2dd32fa3828bb198740129da6dd61f3a41738d160641770617e42c4de2"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.912949 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"10c183285a8e339116991f9115d25e722ac606989a2ea4dc431dedcb645efe1c"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.912990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0c44876be2efa77574b03fe2c3c866d1f58a80f9f89bf45a0c89b9b14de43907"}
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.929938 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8zgb"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.957152 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.957356 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psf4j\" (UniqueName: \"kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.957380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.957415 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: E1124 11:09:03.957785 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.457765339 +0000 UTC m=+150.442585628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.958211 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:03 crc kubenswrapper[4752]: I1124 11:09:03.959096 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:03.999500 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf4j\" (UniqueName: \"kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j\") pod \"certified-operators-glr4w\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.043459 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glr4w"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.059195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.059257 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqj2s\" (UniqueName: \"kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.059313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.059397 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.061078 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.561064484 +0000 UTC m=+150.545884773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.077143 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.078230 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.101030 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163252 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.163393 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.663363298 +0000 UTC m=+150.648183587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqj2s\" (UniqueName: \"kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163527 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163552 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.163973 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.164193 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.164632 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.664620826 +0000 UTC m=+150.649441115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.184826 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqj2s\" (UniqueName: \"kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s\") pod \"community-operators-v468m\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.191193 4752 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.220000 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.264803 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.265210 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2tj\" (UniqueName: \"kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.265234 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.265394 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.765346733 +0000 UTC m=+150.750167032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.265601 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.268580 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7csx"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.277711 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.304351 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7csx"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366803 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2tj\" (UniqueName: \"kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366900 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366923 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.366987 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j676r\" (UniqueName: \"kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.367038 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.367524 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.368103 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.368379 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.868366759 +0000 UTC m=+150.853187048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.413520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2tj\" (UniqueName: \"kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj\") pod \"certified-operators-v5cxt\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.436131 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.468009 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.468149 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.468174 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.468209 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j676r\" (UniqueName: \"kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.468572 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:04.96855619 +0000 UTC m=+150.953376479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.468931 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.469141 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.483911 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j676r\" (UniqueName: \"kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r\") pod \"community-operators-s7csx\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.521142 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-glr4w"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.569947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.570307 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.070291397 +0000 UTC m=+151.055111696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.586174 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 11:09:04 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Nov 24 11:09:04 crc kubenswrapper[4752]: [+]process-running ok
Nov 24 11:09:04 crc kubenswrapper[4752]: healthz check failed
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.586229 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.590711 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v468m"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.626038 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.652500 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"]
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.671459 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.671566 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.171521169 +0000 UTC m=+151.156341458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.672069 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg"
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.672351 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.172339764 +0000 UTC m=+151.157160053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.774410 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.775000 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.274739082 +0000 UTC m=+151.259559371 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.775260 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.775720 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.275710001 +0000 UTC m=+151.260530290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.872149 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7csx"] Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.876638 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.876758 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.376724717 +0000 UTC m=+151.361545006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.876833 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:04 crc kubenswrapper[4752]: E1124 11:09:04.877098 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 11:09:05.377089228 +0000 UTC m=+151.361909587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4lbhg" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.892249 4752 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-24T11:09:04.191387896Z","Handler":null,"Name":""} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.899889 4752 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.899922 4752 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.925790 4752 generic.go:334] "Generic (PLEG): container finished" podID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerID="e1e75ab3abf9ddc1e8e212cecc26d6d9cc6e9e7ab13f2e71cbe8237a8872d173" exitCode=0 Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.925921 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerDied","Data":"e1e75ab3abf9ddc1e8e212cecc26d6d9cc6e9e7ab13f2e71cbe8237a8872d173"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.925972 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerStarted","Data":"9734c0e269530e08ba4a593da6981f3c0c0684530fd99036075d28deb13a7478"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.927416 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerStarted","Data":"3dcc1e9929104ed7d1651d55467f08c801b9bcbce5acb1e1351c2c59c02401c2"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.927577 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.929478 4752 generic.go:334] "Generic (PLEG): container finished" podID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerID="07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7" exitCode=0 Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.930121 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerDied","Data":"07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.930222 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerStarted","Data":"43f5287340dbd7553b644ac5a37a0fd65596be508035f122510be3d6d91f2693"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.941915 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" event={"ID":"05ea66c9-da9e-44a9-b85b-11514abd77e1","Type":"ContainerStarted","Data":"5d9fefb01bdbc41c9a46e418df5876d5b7605a0922cc909ad13fd1e21d9dc5a1"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.945049 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerID="6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2" exitCode=0 Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.945092 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerDied","Data":"6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.945110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerStarted","Data":"1705f9b75b102bd8a3b847598ac4f5630a754849998552f72b01e824463bb671"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.956001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aa25e11b3570ff494ad41a7f45a6d697fc3e61da5074f5ae220b8a6d3f129c24"} Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.956558 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.979081 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 11:09:04 crc kubenswrapper[4752]: I1124 11:09:04.988175 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.007258 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cvplk" podStartSLOduration=11.007237823 podStartE2EDuration="11.007237823s" podCreationTimestamp="2025-11-24 11:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:05.005838561 +0000 UTC m=+150.990658850" watchObservedRunningTime="2025-11-24 11:09:05.007237823 +0000 UTC m=+150.992058112" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.080899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.084168 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.084207 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.106805 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4lbhg\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.185665 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.247800 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.283530 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume\") pod \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.283589 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq5s4\" (UniqueName: \"kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4\") pod \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.283663 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume\") pod \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\" (UID: \"f619c2d4-bcf5-4403-acd6-0bf90e2ece94\") " Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.284299 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume" (OuterVolumeSpecName: "config-volume") pod "f619c2d4-bcf5-4403-acd6-0bf90e2ece94" (UID: "f619c2d4-bcf5-4403-acd6-0bf90e2ece94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.288865 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4" (OuterVolumeSpecName: "kube-api-access-gq5s4") pod "f619c2d4-bcf5-4403-acd6-0bf90e2ece94" (UID: "f619c2d4-bcf5-4403-acd6-0bf90e2ece94"). InnerVolumeSpecName "kube-api-access-gq5s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.289026 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f619c2d4-bcf5-4403-acd6-0bf90e2ece94" (UID: "f619c2d4-bcf5-4403-acd6-0bf90e2ece94"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.385707 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.386067 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq5s4\" (UniqueName: \"kubernetes.io/projected/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-kube-api-access-gq5s4\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.386098 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f619c2d4-bcf5-4403-acd6-0bf90e2ece94-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.441200 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:09:05 crc kubenswrapper[4752]: W1124 11:09:05.454900 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3037291_c53e_4eb9_ae1b_00f71fee5cc5.slice/crio-5bb86b7834bb7d0e881873d47ed42b71849a47fb9c3f4b96005d3f0bdde3bdd2 WatchSource:0}: Error finding container 5bb86b7834bb7d0e881873d47ed42b71849a47fb9c3f4b96005d3f0bdde3bdd2: Status 404 returned error can't find the container with id 5bb86b7834bb7d0e881873d47ed42b71849a47fb9c3f4b96005d3f0bdde3bdd2 Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.588182 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 11:09:05 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Nov 24 11:09:05 crc kubenswrapper[4752]: [+]process-running ok Nov 24 11:09:05 crc kubenswrapper[4752]: healthz check failed Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.588224 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.590873 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 11:09:05 crc kubenswrapper[4752]: E1124 11:09:05.591095 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f619c2d4-bcf5-4403-acd6-0bf90e2ece94" containerName="collect-profiles" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.591108 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f619c2d4-bcf5-4403-acd6-0bf90e2ece94" containerName="collect-profiles" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.591706 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f619c2d4-bcf5-4403-acd6-0bf90e2ece94" containerName="collect-profiles" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.592095 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.594596 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.594899 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.604537 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.678478 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.679694 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.684274 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.687364 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.691663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.691790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.793542 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7578\" (UniqueName: \"kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.793634 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.793681 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.793710 4752 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.793776 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.794235 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.815518 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.894889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7578\" (UniqueName: \"kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.894976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.895004 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.895729 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.896085 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.911119 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q7578\" (UniqueName: \"kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578\") pod \"redhat-marketplace-lbtd5\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.946620 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.964704 4752 generic.go:334] "Generic (PLEG): container finished" podID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerID="507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8" exitCode=0 Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.964807 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerDied","Data":"507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8"} Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.968222 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" event={"ID":"f619c2d4-bcf5-4403-acd6-0bf90e2ece94","Type":"ContainerDied","Data":"aaad34a136dc98b5ba27ef389f2340a2bc6281421784a376f9c260099d91a0e8"} Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.968243 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaad34a136dc98b5ba27ef389f2340a2bc6281421784a376f9c260099d91a0e8" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.968301 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd" Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.979428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" event={"ID":"a3037291-c53e-4eb9-ae1b-00f71fee5cc5","Type":"ContainerStarted","Data":"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9"} Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.979471 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" event={"ID":"a3037291-c53e-4eb9-ae1b-00f71fee5cc5","Type":"ContainerStarted","Data":"5bb86b7834bb7d0e881873d47ed42b71849a47fb9c3f4b96005d3f0bdde3bdd2"} Nov 24 11:09:05 crc kubenswrapper[4752]: I1124 11:09:05.979489 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.017962 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.046516 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" podStartSLOduration=131.04649928 podStartE2EDuration="2m11.04649928s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:06.012925574 +0000 UTC m=+151.997745863" watchObservedRunningTime="2025-11-24 11:09:06.04649928 +0000 UTC m=+152.031319579" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.049967 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.052535 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.108067 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.206915 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrn7\" (UniqueName: \"kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.207274 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.207386 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.244882 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 11:09:06 crc kubenswrapper[4752]: W1124 11:09:06.261630 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod55a42dcf_4065_4af0_a1e8_6b938d9c53fd.slice/crio-91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37 WatchSource:0}: Error finding container 91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37: Status 404 returned error can't find the container with id 91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37 Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.308831 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: 
I1124 11:09:06.308883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrn7\" (UniqueName: \"kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.308904 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.309401 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.309550 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.326659 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrn7\" (UniqueName: \"kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7\") pod \"redhat-marketplace-l2c5j\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.374340 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:09:06 crc kubenswrapper[4752]: W1124 11:09:06.390166 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1af90bc8_c663_4297_84fc_06b6663f837f.slice/crio-b1734c44ad5bff3b6a913b419c06439f789e3ff699e49c0b8f363773b163fc90 WatchSource:0}: Error finding container b1734c44ad5bff3b6a913b419c06439f789e3ff699e49c0b8f363773b163fc90: Status 404 returned error can't find the container with id b1734c44ad5bff3b6a913b419c06439f789e3ff699e49c0b8f363773b163fc90 Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.447029 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.583672 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 11:09:06 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Nov 24 11:09:06 crc kubenswrapper[4752]: [+]process-running ok Nov 24 11:09:06 crc kubenswrapper[4752]: healthz check failed Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.583966 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.756614 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.757294 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:09:06 crc kubenswrapper[4752]: W1124 11:09:06.766847 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3bf6b1_a2a9_4e5b_ad58_7f559317b86e.slice/crio-67d168d56c4bdfbaf34b28f4f2ad4b4ff2a3ebbe46da66f2217ee9a592a6e99c WatchSource:0}: Error finding container 67d168d56c4bdfbaf34b28f4f2ad4b4ff2a3ebbe46da66f2217ee9a592a6e99c: Status 404 returned error can't find the container with id 67d168d56c4bdfbaf34b28f4f2ad4b4ff2a3ebbe46da66f2217ee9a592a6e99c Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.987018 4752 generic.go:334] "Generic (PLEG): container finished" podID="1af90bc8-c663-4297-84fc-06b6663f837f" containerID="b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2" exitCode=0 Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.987084 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerDied","Data":"b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2"} Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.987364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerStarted","Data":"b1734c44ad5bff3b6a913b419c06439f789e3ff699e49c0b8f363773b163fc90"} Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.989354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerStarted","Data":"67d168d56c4bdfbaf34b28f4f2ad4b4ff2a3ebbe46da66f2217ee9a592a6e99c"} Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.994318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"55a42dcf-4065-4af0-a1e8-6b938d9c53fd","Type":"ContainerStarted","Data":"a8867cc17135734272e94c9af7c72746f52da1819d8b82fc1d7e51c9810add09"} Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.994347 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"55a42dcf-4065-4af0-a1e8-6b938d9c53fd","Type":"ContainerStarted","Data":"91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37"} Nov 24 11:09:06 crc kubenswrapper[4752]: I1124 11:09:06.998427 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.006278 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xvmg4" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.049365 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.050284 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.056887 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.057111 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.057289 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.057525 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.058883 4752 patch_prober.go:28] interesting pod/console-f9d7485db-x7lpn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.058919 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-x7lpn" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.067271 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.072999 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.077133 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.080003 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.087021 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.087000223 podStartE2EDuration="2.087000223s" podCreationTimestamp="2025-11-24 11:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:07.085291362 +0000 UTC m=+153.070111651" watchObservedRunningTime="2025-11-24 11:09:07.087000223 +0000 UTC m=+153.071820512" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.094005 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.101152 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-sght6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.101186 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sght6" podUID="e7f6a4fd-346f-4828-b077-4d4888917d6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.101216 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-sght6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.101257 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sght6" podUID="e7f6a4fd-346f-4828-b077-4d4888917d6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.198377 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.227311 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.227374 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.227413 4752 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.227447 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.227467 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54kf\" (UniqueName: \"kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.333427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.333495 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t54kf\" (UniqueName: \"kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.333526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.333581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.333632 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.334136 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.334596 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.335106 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.359677 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54kf\" (UniqueName: \"kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf\") pod \"redhat-operators-z4nx9\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.375632 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.376094 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.397499 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.453889 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.455054 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.482150 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"] Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.543033 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.543155 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb57\" (UniqueName: \"kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.543600 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.557725 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-txwcw" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.578489 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.591921 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 11:09:07 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Nov 24 11:09:07 crc kubenswrapper[4752]: [+]process-running ok Nov 24 11:09:07 crc kubenswrapper[4752]: healthz check failed Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.591987 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.644431 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.644497 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb57\" (UniqueName: \"kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124
11:09:07.644523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.644985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.645284 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.665799 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffb57\" (UniqueName: \"kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57\") pod \"redhat-operators-gxcj6\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.782375 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.865424 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.889936 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 11:09:07 crc kubenswrapper[4752]: W1124 11:09:07.970481 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod21d5896c_3f0a_46a6_ae1a_71bc6662e22b.slice/crio-588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea WatchSource:0}: Error finding container 588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea: Status 404 returned error can't find the container with id 588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea Nov 24 11:09:07 crc kubenswrapper[4752]: I1124 11:09:07.978705 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:09:08 crc kubenswrapper[4752]: W1124 11:09:08.007134 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2208d90e_0865_4938_8423_ea3d37ca26db.slice/crio-f6b68b980bceff1daa58abb7fd8562877c365975213a98fc20e09054f9e63577 WatchSource:0}: Error finding container f6b68b980bceff1daa58abb7fd8562877c365975213a98fc20e09054f9e63577: Status 404 returned error can't find the container with id f6b68b980bceff1daa58abb7fd8562877c365975213a98fc20e09054f9e63577 Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.011666 4752 generic.go:334] "Generic (PLEG): container finished" podID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerID="87d7921fb69a3b1aac21d6c71a07ce4ac22b8c83a89bd01fa735698a69d393f4" exitCode=0 Nov 24 11:09:08 crc kubenswrapper[4752]: 
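The MountVolume and VerifyControllerAttachedVolume entries above identify every volume by a UniqueName of the form kubernetes.io/<plugin>/<pod-UID>-<volume-name>. Below is a minimal Go sketch of pulling those three pieces back out of such a string when reading these logs; splitUniqueName is a hypothetical helper, not kubelet code, and it assumes the pod UID is always a 36-character UUID:

    package main

    import (
    	"fmt"
    	"strings"
    )

    // splitUniqueName splits a volume UniqueName like the ones logged above
    // into plugin, pod UID and volume name. Hypothetical helper for log
    // analysis; assumes the UID is a 36-character RFC 4122 UUID.
    func splitUniqueName(u string) (plugin, podUID, volume string) {
    	i := strings.LastIndex(u, "/") // plugin prefix ends at the last slash
    	plugin, rest := u[:i], u[i+1:]
    	return plugin, rest[:36], rest[37:] // skip the "-" after the UID
    }

    func main() {
    	p, uid, v := splitUniqueName("kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities")
    	fmt.Println(p, uid, v) // kubernetes.io/empty-dir a3536d4b-439e-4718-bb9a-97629f7beec8 utilities
    }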
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.011774 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerDied","Data":"87d7921fb69a3b1aac21d6c71a07ce4ac22b8c83a89bd01fa735698a69d393f4"}
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.030111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21d5896c-3f0a-46a6-ae1a-71bc6662e22b","Type":"ContainerStarted","Data":"588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea"}
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.035996 4752 generic.go:334] "Generic (PLEG): container finished" podID="55a42dcf-4065-4af0-a1e8-6b938d9c53fd" containerID="a8867cc17135734272e94c9af7c72746f52da1819d8b82fc1d7e51c9810add09" exitCode=0
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.036067 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"55a42dcf-4065-4af0-a1e8-6b938d9c53fd","Type":"ContainerDied","Data":"a8867cc17135734272e94c9af7c72746f52da1819d8b82fc1d7e51c9810add09"}
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.395910 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"]
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.583763 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 11:09:08 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Nov 24 11:09:08 crc kubenswrapper[4752]: [+]process-running ok
Nov 24 11:09:08 crc kubenswrapper[4752]: healthz check failed
Nov 24 11:09:08 crc kubenswrapper[4752]: I1124 11:09:08.583856 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.066073 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21d5896c-3f0a-46a6-ae1a-71bc6662e22b","Type":"ContainerStarted","Data":"5642b9cdd4a06ad32cb784da6cde1c105bb76d1983bad6aada7402d80c022e9a"}
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.076855 4752 generic.go:334] "Generic (PLEG): container finished" podID="2208d90e-0865-4938-8423-ea3d37ca26db" containerID="fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e" exitCode=0
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.076926 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerDied","Data":"fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e"}
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.076954 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerStarted","Data":"f6b68b980bceff1daa58abb7fd8562877c365975213a98fc20e09054f9e63577"}
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.088830 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.088813594 podStartE2EDuration="2.088813594s" podCreationTimestamp="2025-11-24 11:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:09.087587307 +0000 UTC m=+155.072407606" watchObservedRunningTime="2025-11-24 11:09:09.088813594 +0000 UTC m=+155.073633873"
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.098549 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerID="35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a" exitCode=0
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.098637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerDied","Data":"35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a"}
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.098659 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerStarted","Data":"432901bc784de78ecfda47b74b5a5c2c805eaf06b394dc2041e1b4989b4c49a3"}
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.586428 4752 patch_prober.go:28] interesting pod/router-default-5444994796-w7dbc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 11:09:09 crc kubenswrapper[4752]: [+]has-synced ok
Nov 24 11:09:09 crc kubenswrapper[4752]: [+]process-running ok
Nov 24 11:09:09 crc kubenswrapper[4752]: healthz check failed
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.586819 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w7dbc" podUID="c0998cf9-2c91-46b3-a9b8-a070e8b855b2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.604281 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.703570 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kubelet-dir\") pod \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") "
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.704513 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access\") pod \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\" (UID: \"55a42dcf-4065-4af0-a1e8-6b938d9c53fd\") " Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.705038 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.727571 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "55a42dcf-4065-4af0-a1e8-6b938d9c53fd" (UID: "55a42dcf-4065-4af0-a1e8-6b938d9c53fd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:09:09 crc kubenswrapper[4752]: I1124 11:09:09.806710 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55a42dcf-4065-4af0-a1e8-6b938d9c53fd-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.137793 4752 generic.go:334] "Generic (PLEG): container finished" podID="21d5896c-3f0a-46a6-ae1a-71bc6662e22b" containerID="5642b9cdd4a06ad32cb784da6cde1c105bb76d1983bad6aada7402d80c022e9a" exitCode=0 Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.138045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21d5896c-3f0a-46a6-ae1a-71bc6662e22b","Type":"ContainerDied","Data":"5642b9cdd4a06ad32cb784da6cde1c105bb76d1983bad6aada7402d80c022e9a"} Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.144255 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"55a42dcf-4065-4af0-a1e8-6b938d9c53fd","Type":"ContainerDied","Data":"91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37"} Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.144343 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e1bacac0bf490e1fb4d09f6dabc20330dcdc6d7a2ac8e1e2e46fc351f81c37" Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.144438 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.582675 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:09:10 crc kubenswrapper[4752]: I1124 11:09:10.595271 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w7dbc" Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.451586 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.646201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir\") pod \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.646289 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access\") pod \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\" (UID: \"21d5896c-3f0a-46a6-ae1a-71bc6662e22b\") " Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.646707 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21d5896c-3f0a-46a6-ae1a-71bc6662e22b" (UID: "21d5896c-3f0a-46a6-ae1a-71bc6662e22b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.648116 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.671074 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21d5896c-3f0a-46a6-ae1a-71bc6662e22b" (UID: "21d5896c-3f0a-46a6-ae1a-71bc6662e22b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:09:11 crc kubenswrapper[4752]: I1124 11:09:11.750658 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21d5896c-3f0a-46a6-ae1a-71bc6662e22b-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 11:09:12 crc kubenswrapper[4752]: I1124 11:09:12.192956 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21d5896c-3f0a-46a6-ae1a-71bc6662e22b","Type":"ContainerDied","Data":"588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea"} Nov 24 11:09:12 crc kubenswrapper[4752]: I1124 11:09:12.192990 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588b8132b39200aebe9f163086b73e9afdd33a8f5587d7e352d356b9c9e980ea" Nov 24 11:09:12 crc kubenswrapper[4752]: I1124 11:09:12.193037 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 11:09:12 crc kubenswrapper[4752]: I1124 11:09:12.643937 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sd7kr" Nov 24 11:09:15 crc kubenswrapper[4752]: I1124 11:09:15.469266 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:09:15 crc kubenswrapper[4752]: I1124 11:09:15.469966 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.081220 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.085958 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.109956 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sght6" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.737318 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.743109 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6dea074-570b-440e-b555-46c1dde88efa-metrics-certs\") pod \"network-metrics-daemon-8gb7x\" (UID: \"d6dea074-570b-440e-b555-46c1dde88efa\") " pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:09:17 crc kubenswrapper[4752]: I1124 11:09:17.748904 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8gb7x" Nov 24 11:09:25 crc kubenswrapper[4752]: I1124 11:09:25.253986 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:09:34 crc kubenswrapper[4752]: E1124 11:09:34.895191 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 24 11:09:34 crc kubenswrapper[4752]: E1124 11:09:34.896172 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqj2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-v468m_openshift-marketplace(e5633c2f-36d1-4d10-a12c-f58e67571364): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:34 crc kubenswrapper[4752]: E1124 11:09:34.897388 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-v468m" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" Nov 24 11:09:37 crc kubenswrapper[4752]: I1124 11:09:37.535143 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-htkkq" Nov 24 11:09:37 crc kubenswrapper[4752]: E1124 11:09:37.929260 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-v468m" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" Nov 24 11:09:38 crc 
kubenswrapper[4752]: E1124 11:09:37.999651 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 24 11:09:38 crc kubenswrapper[4752]: E1124 11:09:38.000171 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffb57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gxcj6_openshift-marketplace(a3536d4b-439e-4718-bb9a-97629f7beec8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:38 crc kubenswrapper[4752]: E1124 11:09:38.001381 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gxcj6" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" Nov 24 11:09:39 crc kubenswrapper[4752]: E1124 11:09:39.358782 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gxcj6" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.317510 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.318871 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7578,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-lbtd5_openshift-marketplace(1af90bc8-c663-4297-84fc-06b6663f837f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.320324 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lbtd5" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.389045 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.389209 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tsrn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-l2c5j_openshift-marketplace(cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.391391 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-l2c5j" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.407713 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.407953 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn2tj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v5cxt_openshift-marketplace(0e9beada-5b6e-4ea6-adce-27504c85c851): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.409555 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v5cxt" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.415150 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lbtd5" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.415427 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-l2c5j" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.417149 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.417318 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j676r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-s7csx_openshift-marketplace(367c3613-ba5f-4154-9bc9-8ccdf47389f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.418452 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-s7csx" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.461451 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.462279 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t54kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z4nx9_openshift-marketplace(2208d90e-0865-4938-8423-ea3d37ca26db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 24 11:09:40 crc kubenswrapper[4752]: E1124 11:09:40.464921 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z4nx9" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" Nov 24 11:09:40 crc kubenswrapper[4752]: I1124 11:09:40.498627 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8gb7x"] Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.421685 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" event={"ID":"d6dea074-570b-440e-b555-46c1dde88efa","Type":"ContainerStarted","Data":"fe1510cced3aa38b51a164d43fd50735f12d4eaded60540983c95d2a70a0b70f"} Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.424173 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" event={"ID":"d6dea074-570b-440e-b555-46c1dde88efa","Type":"ContainerStarted","Data":"2f1b2ac80a9b83d50423ef1c0d76e092db4d89f367ec84eb7fe6fce0434cf453"} Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.424207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8gb7x" event={"ID":"d6dea074-570b-440e-b555-46c1dde88efa","Type":"ContainerStarted","Data":"c9273bd213130a6f5b4ddbc58e1a344f552f477db4593c5e03ec6679334dd033"} Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.427305 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerID="518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a" exitCode=0 Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.427509 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" 
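Each kuberuntime_manager.go "Unhandled Error" entry above dumps the failing extract-content init container as a Go struct literal. Below, the same container rewritten with the k8s.io/api types, keeping only the fields actually visible in the dumps (shown for the redhat-operators-gxcj6 variant); everything else is left at its zero value:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func ptr[T any](v T) *T { return &v }

    // Sketch of the extract-content init container as dumped above; field
    // values are copied from the log, nothing new is introduced.
    func main() {
    	c := corev1.Container{
    		Name:    "extract-content",
    		Image:   "registry.redhat.io/redhat/redhat-operator-index:v4.18",
    		Command: []string{"/utilities/copy-content"},
    		Args: []string{
    			"--catalog.from=/configs",
    			"--catalog.to=/extracted-catalog/catalog",
    			"--cache.from=/tmp/cache",
    			"--cache.to=/extracted-catalog/cache",
    		},
    		VolumeMounts: []corev1.VolumeMount{
    			{Name: "utilities", MountPath: "/utilities"},
    			{Name: "catalog-content", MountPath: "/extracted-catalog"},
    			{Name: "kube-api-access-ffb57", ReadOnly: true, MountPath: "/var/run/secrets/kubernetes.io/serviceaccount"},
    		},
    		ImagePullPolicy:          corev1.PullAlways,
    		TerminationMessagePath:   "/dev/termination-log",
    		TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
    		SecurityContext: &corev1.SecurityContext{
    			Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
    			RunAsUser:                ptr(int64(1000170000)),
    			RunAsNonRoot:             ptr(true),
    			AllowPrivilegeEscalation: ptr(false),
    		},
    	}
    	fmt.Println(c.Name, c.Image)
    }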
event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerDied","Data":"518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a"} Nov 24 11:09:41 crc kubenswrapper[4752]: E1124 11:09:41.432704 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-z4nx9" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" Nov 24 11:09:41 crc kubenswrapper[4752]: E1124 11:09:41.432985 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-s7csx" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" Nov 24 11:09:41 crc kubenswrapper[4752]: E1124 11:09:41.434478 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v5cxt" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" Nov 24 11:09:41 crc kubenswrapper[4752]: I1124 11:09:41.468007 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8gb7x" podStartSLOduration=166.467988475 podStartE2EDuration="2m46.467988475s" podCreationTimestamp="2025-11-24 11:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:09:41.449194677 +0000 UTC m=+187.434014966" watchObservedRunningTime="2025-11-24 11:09:41.467988475 +0000 UTC m=+187.452808764" Nov 24 11:09:42 crc kubenswrapper[4752]: I1124 11:09:42.435893 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerStarted","Data":"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7"} Nov 24 11:09:42 crc kubenswrapper[4752]: I1124 11:09:42.458891 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-glr4w" podStartSLOduration=2.577783161 podStartE2EDuration="39.458873638s" podCreationTimestamp="2025-11-24 11:09:03 +0000 UTC" firstStartedPulling="2025-11-24 11:09:04.949353342 +0000 UTC m=+150.934173631" lastFinishedPulling="2025-11-24 11:09:41.830443819 +0000 UTC m=+187.815264108" observedRunningTime="2025-11-24 11:09:42.455375742 +0000 UTC m=+188.440196041" watchObservedRunningTime="2025-11-24 11:09:42.458873638 +0000 UTC m=+188.443693937" Nov 24 11:09:42 crc kubenswrapper[4752]: I1124 11:09:42.862848 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 11:09:44 crc kubenswrapper[4752]: I1124 11:09:44.045107 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:09:44 crc kubenswrapper[4752]: I1124 11:09:44.045794 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:09:44 crc kubenswrapper[4752]: I1124 11:09:44.182350 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:09:45 crc kubenswrapper[4752]: I1124 11:09:45.468804 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:09:45 crc kubenswrapper[4752]: I1124 11:09:45.468878 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:09:50 crc kubenswrapper[4752]: I1124 11:09:50.481273 4752 generic.go:334] "Generic (PLEG): container finished" podID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerID="225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a" exitCode=0 Nov 24 11:09:50 crc kubenswrapper[4752]: I1124 11:09:50.481364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerDied","Data":"225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a"} Nov 24 11:09:51 crc kubenswrapper[4752]: I1124 11:09:51.490265 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerStarted","Data":"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62"} Nov 24 11:09:51 crc kubenswrapper[4752]: I1124 11:09:51.512852 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v468m" podStartSLOduration=2.554131893 podStartE2EDuration="48.512786571s" podCreationTimestamp="2025-11-24 11:09:03 +0000 UTC" firstStartedPulling="2025-11-24 11:09:04.932716639 +0000 UTC m=+150.917536918" lastFinishedPulling="2025-11-24 11:09:50.891371307 +0000 UTC m=+196.876191596" observedRunningTime="2025-11-24 11:09:51.509466102 +0000 UTC m=+197.494286411" watchObservedRunningTime="2025-11-24 11:09:51.512786571 +0000 UTC m=+197.497606860" Nov 24 11:09:53 crc kubenswrapper[4752]: I1124 11:09:53.501008 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerStarted","Data":"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105"} Nov 24 11:09:53 crc kubenswrapper[4752]: I1124 11:09:53.503764 4752 generic.go:334] "Generic (PLEG): container finished" podID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerID="ac607987590092d81b63d875bd1e3ccd230dc8a4715d89f8598430663833013e" exitCode=0 Nov 24 11:09:53 crc kubenswrapper[4752]: I1124 11:09:53.503790 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerDied","Data":"ac607987590092d81b63d875bd1e3ccd230dc8a4715d89f8598430663833013e"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.124151 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.221663 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v468m" Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.221705 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v468m" Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.269483 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v468m" Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.510380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerStarted","Data":"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.512470 4752 generic.go:334] "Generic (PLEG): container finished" podID="2208d90e-0865-4938-8423-ea3d37ca26db" containerID="48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5" exitCode=0 Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.512535 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerDied","Data":"48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.515683 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerStarted","Data":"8d368dec77a8671ea4b52698cd9772012fe64a4eae9bf8ebbf0948888ff02747"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.521266 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerID="dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105" exitCode=0 Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.521361 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerDied","Data":"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.524149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerStarted","Data":"020e92bd9868ff0510dd3c46dfacca794180e072f7e1b4f95b66aacffdfa7fc1"} Nov 24 11:09:54 crc kubenswrapper[4752]: I1124 11:09:54.587781 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2c5j" podStartSLOduration=2.7085554370000002 podStartE2EDuration="48.587737523s" podCreationTimestamp="2025-11-24 11:09:06 +0000 UTC" firstStartedPulling="2025-11-24 11:09:08.037579067 +0000 UTC m=+154.022399356" lastFinishedPulling="2025-11-24 11:09:53.916761143 +0000 UTC m=+199.901581442" observedRunningTime="2025-11-24 11:09:54.586327291 +0000 UTC m=+200.571147580" watchObservedRunningTime="2025-11-24 11:09:54.587737523 +0000 UTC m=+200.572557812" Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.531673 4752 generic.go:334] "Generic (PLEG): container finished" podID="1af90bc8-c663-4297-84fc-06b6663f837f" containerID="864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7" exitCode=0 Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.531789 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerDied","Data":"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7"} Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.536390 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerStarted","Data":"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828"} Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.539477 4752 generic.go:334] "Generic (PLEG): container finished" podID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerID="8d368dec77a8671ea4b52698cd9772012fe64a4eae9bf8ebbf0948888ff02747" exitCode=0 Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.539600 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerDied","Data":"8d368dec77a8671ea4b52698cd9772012fe64a4eae9bf8ebbf0948888ff02747"} Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.542858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerStarted","Data":"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42"} Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.603109 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z4nx9" podStartSLOduration=2.761847872 podStartE2EDuration="48.603091049s" podCreationTimestamp="2025-11-24 11:09:07 +0000 UTC" firstStartedPulling="2025-11-24 11:09:09.078466021 +0000 UTC m=+155.063286310" lastFinishedPulling="2025-11-24 11:09:54.919709188 +0000 UTC m=+200.904529487" observedRunningTime="2025-11-24 11:09:55.601991116 +0000 UTC m=+201.586811415" watchObservedRunningTime="2025-11-24 11:09:55.603091049 +0000 UTC m=+201.587911338" Nov 24 11:09:55 crc kubenswrapper[4752]: I1124 11:09:55.621436 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxcj6" podStartSLOduration=2.449544735 podStartE2EDuration="48.621417436s" podCreationTimestamp="2025-11-24 11:09:07 +0000 UTC" firstStartedPulling="2025-11-24 11:09:09.105702405 +0000 UTC m=+155.090522684" lastFinishedPulling="2025-11-24 11:09:55.277575096 +0000 UTC m=+201.262395385" observedRunningTime="2025-11-24 11:09:55.619244491 +0000 UTC m=+201.604064780" watchObservedRunningTime="2025-11-24 11:09:55.621417436 +0000 UTC m=+201.606237725" Nov 24 11:09:56 crc kubenswrapper[4752]: I1124 11:09:56.448094 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:56 crc kubenswrapper[4752]: I1124 11:09:56.448161 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:56 crc kubenswrapper[4752]: I1124 11:09:56.499197 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:09:56 crc kubenswrapper[4752]: I1124 11:09:56.549821 4752 generic.go:334] "Generic (PLEG): container finished" podID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerID="3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497" exitCode=0 
Nov 24 11:09:56 crc kubenswrapper[4752]: I1124 11:09:56.549928 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerDied","Data":"3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497"}
Nov 24 11:09:57 crc kubenswrapper[4752]: I1124 11:09:57.398062 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z4nx9"
Nov 24 11:09:57 crc kubenswrapper[4752]: I1124 11:09:57.398127 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z4nx9"
Nov 24 11:09:57 crc kubenswrapper[4752]: I1124 11:09:57.782807 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxcj6"
Nov 24 11:09:57 crc kubenswrapper[4752]: I1124 11:09:57.782890 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxcj6"
Nov 24 11:09:58 crc kubenswrapper[4752]: I1124 11:09:58.436609 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z4nx9" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="registry-server" probeResult="failure" output=<
Nov 24 11:09:58 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s
Nov 24 11:09:58 crc kubenswrapper[4752]: >
Nov 24 11:09:58 crc kubenswrapper[4752]: I1124 11:09:58.829212 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxcj6" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="registry-server" probeResult="failure" output=<
Nov 24 11:09:58 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s
Nov 24 11:09:58 crc kubenswrapper[4752]: >
Nov 24 11:09:59 crc kubenswrapper[4752]: I1124 11:09:59.567726 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerStarted","Data":"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95"}
Nov 24 11:09:59 crc kubenswrapper[4752]: I1124 11:09:59.592982 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbtd5" podStartSLOduration=3.576977267 podStartE2EDuration="54.592963086s" podCreationTimestamp="2025-11-24 11:09:05 +0000 UTC" firstStartedPulling="2025-11-24 11:09:06.989261217 +0000 UTC m=+152.974081506" lastFinishedPulling="2025-11-24 11:09:58.005247046 +0000 UTC m=+203.990067325" observedRunningTime="2025-11-24 11:09:59.589703429 +0000 UTC m=+205.574523718" watchObservedRunningTime="2025-11-24 11:09:59.592963086 +0000 UTC m=+205.577783375"
Nov 24 11:10:00 crc kubenswrapper[4752]: I1124 11:10:00.574907 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerStarted","Data":"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f"}
Nov 24 11:10:00 crc kubenswrapper[4752]: I1124 11:10:00.594726 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7csx" podStartSLOduration=3.186087957 podStartE2EDuration="56.594701496s" podCreationTimestamp="2025-11-24 11:09:04 +0000 UTC" firstStartedPulling="2025-11-24 11:09:05.966326225 +0000 UTC m=+151.951146514" lastFinishedPulling="2025-11-24 11:09:59.374939764 +0000 UTC m=+205.359760053" observedRunningTime="2025-11-24 11:10:00.590573802 +0000 UTC m=+206.575394091" watchObservedRunningTime="2025-11-24 11:10:00.594701496 +0000 UTC m=+206.579521785"
Nov 24 11:10:02 crc kubenswrapper[4752]: I1124 11:10:02.601455 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerStarted","Data":"b102c10d3bd80d34a53a4e2f2250a45f5b4ee456bbabb8901c435b328faee8ce"}
Nov 24 11:10:03 crc kubenswrapper[4752]: I1124 11:10:03.626158 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v5cxt" podStartSLOduration=3.0668568880000002 podStartE2EDuration="59.626136247s" podCreationTimestamp="2025-11-24 11:09:04 +0000 UTC" firstStartedPulling="2025-11-24 11:09:04.927354247 +0000 UTC m=+150.912174536" lastFinishedPulling="2025-11-24 11:10:01.486633606 +0000 UTC m=+207.471453895" observedRunningTime="2025-11-24 11:10:03.622494318 +0000 UTC m=+209.607314617" watchObservedRunningTime="2025-11-24 11:10:03.626136247 +0000 UTC m=+209.610956536"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.274798 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v468m"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.436797 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.436885 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.485070 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v5cxt"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.627528 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.627589 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:10:04 crc kubenswrapper[4752]: I1124 11:10:04.695243 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:10:05 crc kubenswrapper[4752]: I1124 11:10:05.665003 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7csx"
Nov 24 11:10:06 crc kubenswrapper[4752]: I1124 11:10:06.018280 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbtd5"
Nov 24 11:10:06 crc kubenswrapper[4752]: I1124 11:10:06.018372 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbtd5"
Nov 24 11:10:06 crc kubenswrapper[4752]: I1124 11:10:06.068681 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbtd5"
Nov 24 11:10:06 crc kubenswrapper[4752]: I1124 11:10:06.493278 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2c5j"
11:10:06.523532 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7csx"] Nov 24 11:10:06 crc kubenswrapper[4752]: I1124 11:10:06.675479 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.517863 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.583058 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.631227 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s7csx" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="registry-server" containerID="cri-o://7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f" gracePeriod=2 Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.841522 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.896218 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:10:07 crc kubenswrapper[4752]: I1124 11:10:07.980938 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7csx" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.123209 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j676r\" (UniqueName: \"kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r\") pod \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.123296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content\") pod \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.123388 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities\") pod \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\" (UID: \"367c3613-ba5f-4154-9bc9-8ccdf47389f2\") " Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.124895 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities" (OuterVolumeSpecName: "utilities") pod "367c3613-ba5f-4154-9bc9-8ccdf47389f2" (UID: "367c3613-ba5f-4154-9bc9-8ccdf47389f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.129125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r" (OuterVolumeSpecName: "kube-api-access-j676r") pod "367c3613-ba5f-4154-9bc9-8ccdf47389f2" (UID: "367c3613-ba5f-4154-9bc9-8ccdf47389f2"). InnerVolumeSpecName "kube-api-access-j676r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.177649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "367c3613-ba5f-4154-9bc9-8ccdf47389f2" (UID: "367c3613-ba5f-4154-9bc9-8ccdf47389f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.225386 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.225710 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j676r\" (UniqueName: \"kubernetes.io/projected/367c3613-ba5f-4154-9bc9-8ccdf47389f2-kube-api-access-j676r\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.225894 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/367c3613-ba5f-4154-9bc9-8ccdf47389f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.640689 4752 generic.go:334] "Generic (PLEG): container finished" podID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerID="7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f" exitCode=0 Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.640785 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerDied","Data":"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f"} Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.640816 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s7csx" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.640842 4752 scope.go:117] "RemoveContainer" containerID="7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.640831 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7csx" event={"ID":"367c3613-ba5f-4154-9bc9-8ccdf47389f2","Type":"ContainerDied","Data":"3dcc1e9929104ed7d1651d55467f08c801b9bcbce5acb1e1351c2c59c02401c2"} Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.662809 4752 scope.go:117] "RemoveContainer" containerID="3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.671545 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7csx"] Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.674315 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s7csx"] Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.699504 4752 scope.go:117] "RemoveContainer" containerID="507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.755971 4752 scope.go:117] "RemoveContainer" containerID="7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f" Nov 24 11:10:08 crc kubenswrapper[4752]: E1124 11:10:08.757230 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f\": container with ID starting with 7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f not found: ID does not exist" containerID="7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.757311 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f"} err="failed to get container status \"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f\": rpc error: code = NotFound desc = could not find container \"7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f\": container with ID starting with 7930938b2b3f54fb8298fc51fbc212308ef4382ec2f7445e0052f9acac77e72f not found: ID does not exist" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.757406 4752 scope.go:117] "RemoveContainer" containerID="3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497" Nov 24 11:10:08 crc kubenswrapper[4752]: E1124 11:10:08.757697 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497\": container with ID starting with 3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497 not found: ID does not exist" containerID="3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.757732 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497"} err="failed to get container status \"3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497\": rpc error: code = NotFound desc = could not find 
container \"3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497\": container with ID starting with 3cf1bcecd69ac8f8d82ff4ca36794a9e50c92d84cd07700e865bcc38ed7e4497 not found: ID does not exist" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.757772 4752 scope.go:117] "RemoveContainer" containerID="507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8" Nov 24 11:10:08 crc kubenswrapper[4752]: E1124 11:10:08.758137 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8\": container with ID starting with 507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8 not found: ID does not exist" containerID="507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.758189 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8"} err="failed to get container status \"507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8\": rpc error: code = NotFound desc = could not find container \"507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8\": container with ID starting with 507f2d82ac4195329304a2582fb178ab93479358912e87a2c2a2ad27e96175f8 not found: ID does not exist" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.763666 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" path="/var/lib/kubelet/pods/367c3613-ba5f-4154-9bc9-8ccdf47389f2/volumes" Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.919339 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:10:08 crc kubenswrapper[4752]: I1124 11:10:08.919643 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2c5j" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="registry-server" containerID="cri-o://020e92bd9868ff0510dd3c46dfacca794180e072f7e1b4f95b66aacffdfa7fc1" gracePeriod=2 Nov 24 11:10:09 crc kubenswrapper[4752]: I1124 11:10:09.650516 4752 generic.go:334] "Generic (PLEG): container finished" podID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerID="020e92bd9868ff0510dd3c46dfacca794180e072f7e1b4f95b66aacffdfa7fc1" exitCode=0 Nov 24 11:10:09 crc kubenswrapper[4752]: I1124 11:10:09.650570 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerDied","Data":"020e92bd9868ff0510dd3c46dfacca794180e072f7e1b4f95b66aacffdfa7fc1"} Nov 24 11:10:09 crc kubenswrapper[4752]: I1124 11:10:09.986776 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.174353 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content\") pod \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.174438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrn7\" (UniqueName: \"kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7\") pod \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.174524 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities\") pod \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\" (UID: \"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e\") " Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.175515 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities" (OuterVolumeSpecName: "utilities") pod "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" (UID: "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.177894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7" (OuterVolumeSpecName: "kube-api-access-tsrn7") pod "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" (UID: "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e"). InnerVolumeSpecName "kube-api-access-tsrn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.192275 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" (UID: "cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.275548 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.275596 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsrn7\" (UniqueName: \"kubernetes.io/projected/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-kube-api-access-tsrn7\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.275609 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.657530 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2c5j" event={"ID":"cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e","Type":"ContainerDied","Data":"67d168d56c4bdfbaf34b28f4f2ad4b4ff2a3ebbe46da66f2217ee9a592a6e99c"} Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.657578 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2c5j" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.657594 4752 scope.go:117] "RemoveContainer" containerID="020e92bd9868ff0510dd3c46dfacca794180e072f7e1b4f95b66aacffdfa7fc1" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.681581 4752 scope.go:117] "RemoveContainer" containerID="ac607987590092d81b63d875bd1e3ccd230dc8a4715d89f8598430663833013e" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.699244 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.705524 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2c5j"] Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.712352 4752 scope.go:117] "RemoveContainer" containerID="87d7921fb69a3b1aac21d6c71a07ce4ac22b8c83a89bd01fa735698a69d393f4" Nov 24 11:10:10 crc kubenswrapper[4752]: I1124 11:10:10.736254 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" path="/var/lib/kubelet/pods/cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e/volumes" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.118394 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"] Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.118611 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxcj6" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="registry-server" containerID="cri-o://39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42" gracePeriod=2 Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.471519 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.493395 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffb57\" (UniqueName: \"kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57\") pod \"a3536d4b-439e-4718-bb9a-97629f7beec8\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.498735 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57" (OuterVolumeSpecName: "kube-api-access-ffb57") pod "a3536d4b-439e-4718-bb9a-97629f7beec8" (UID: "a3536d4b-439e-4718-bb9a-97629f7beec8"). InnerVolumeSpecName "kube-api-access-ffb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.596016 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities\") pod \"a3536d4b-439e-4718-bb9a-97629f7beec8\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.596080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content\") pod \"a3536d4b-439e-4718-bb9a-97629f7beec8\" (UID: \"a3536d4b-439e-4718-bb9a-97629f7beec8\") " Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.596416 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffb57\" (UniqueName: \"kubernetes.io/projected/a3536d4b-439e-4718-bb9a-97629f7beec8-kube-api-access-ffb57\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.599503 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities" (OuterVolumeSpecName: "utilities") pod "a3536d4b-439e-4718-bb9a-97629f7beec8" (UID: "a3536d4b-439e-4718-bb9a-97629f7beec8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.664233 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerID="39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42" exitCode=0 Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.664291 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxcj6" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.664279 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerDied","Data":"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42"} Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.664412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxcj6" event={"ID":"a3536d4b-439e-4718-bb9a-97629f7beec8","Type":"ContainerDied","Data":"432901bc784de78ecfda47b74b5a5c2c805eaf06b394dc2041e1b4989b4c49a3"} Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.664437 4752 scope.go:117] "RemoveContainer" containerID="39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.678671 4752 scope.go:117] "RemoveContainer" containerID="dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.684040 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3536d4b-439e-4718-bb9a-97629f7beec8" (UID: "a3536d4b-439e-4718-bb9a-97629f7beec8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.693586 4752 scope.go:117] "RemoveContainer" containerID="35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.697462 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.697509 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3536d4b-439e-4718-bb9a-97629f7beec8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.710573 4752 scope.go:117] "RemoveContainer" containerID="39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42" Nov 24 11:10:11 crc kubenswrapper[4752]: E1124 11:10:11.711253 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42\": container with ID starting with 39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42 not found: ID does not exist" containerID="39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.711314 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42"} err="failed to get container status \"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42\": rpc error: code = NotFound desc = could not find container \"39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42\": container with ID starting with 39fa8a90879021c769b4836372098b9cafcb3fbc704e73bbf5a1c8c14122ea42 not found: ID does not exist" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.711350 4752 scope.go:117] 
"RemoveContainer" containerID="dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105" Nov 24 11:10:11 crc kubenswrapper[4752]: E1124 11:10:11.711735 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105\": container with ID starting with dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105 not found: ID does not exist" containerID="dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.711783 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105"} err="failed to get container status \"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105\": rpc error: code = NotFound desc = could not find container \"dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105\": container with ID starting with dd2068d2ee9be154a4c00e5600bd277008de009ed3ca984e3e5a2f8fb40ae105 not found: ID does not exist" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.711800 4752 scope.go:117] "RemoveContainer" containerID="35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a" Nov 24 11:10:11 crc kubenswrapper[4752]: E1124 11:10:11.712219 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a\": container with ID starting with 35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a not found: ID does not exist" containerID="35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.712271 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a"} err="failed to get container status \"35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a\": rpc error: code = NotFound desc = could not find container \"35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a\": container with ID starting with 35e79c94e33df1e23188b67edc27bb360b2868678205abc41b995e8bbab2a95a not found: ID does not exist" Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.996044 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"] Nov 24 11:10:11 crc kubenswrapper[4752]: I1124 11:10:11.998923 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxcj6"] Nov 24 11:10:12 crc kubenswrapper[4752]: I1124 11:10:12.733822 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" path="/var/lib/kubelet/pods/a3536d4b-439e-4718-bb9a-97629f7beec8/volumes" Nov 24 11:10:14 crc kubenswrapper[4752]: I1124 11:10:14.483315 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v5cxt" Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.468808 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:10:15 crc 
kubenswrapper[4752]: I1124 11:10:15.468907 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.468965 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.469611 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.469680 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d" gracePeriod=600 Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.685315 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d" exitCode=0 Nov 24 11:10:15 crc kubenswrapper[4752]: I1124 11:10:15.685364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d"} Nov 24 11:10:16 crc kubenswrapper[4752]: I1124 11:10:16.289541 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:10:16 crc kubenswrapper[4752]: I1124 11:10:16.691356 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088"} Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.523212 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"] Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.523838 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v5cxt" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="registry-server" containerID="cri-o://b102c10d3bd80d34a53a4e2f2250a45f5b4ee456bbabb8901c435b328faee8ce" gracePeriod=2 Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.700796 4752 generic.go:334] "Generic (PLEG): container finished" podID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerID="b102c10d3bd80d34a53a4e2f2250a45f5b4ee456bbabb8901c435b328faee8ce" exitCode=0 Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.700885 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" 
event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerDied","Data":"b102c10d3bd80d34a53a4e2f2250a45f5b4ee456bbabb8901c435b328faee8ce"} Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.859925 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5cxt" Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.979140 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content\") pod \"0e9beada-5b6e-4ea6-adce-27504c85c851\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.979276 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities\") pod \"0e9beada-5b6e-4ea6-adce-27504c85c851\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.979323 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2tj\" (UniqueName: \"kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj\") pod \"0e9beada-5b6e-4ea6-adce-27504c85c851\" (UID: \"0e9beada-5b6e-4ea6-adce-27504c85c851\") " Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.980347 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities" (OuterVolumeSpecName: "utilities") pod "0e9beada-5b6e-4ea6-adce-27504c85c851" (UID: "0e9beada-5b6e-4ea6-adce-27504c85c851"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:17 crc kubenswrapper[4752]: I1124 11:10:17.986214 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj" (OuterVolumeSpecName: "kube-api-access-sn2tj") pod "0e9beada-5b6e-4ea6-adce-27504c85c851" (UID: "0e9beada-5b6e-4ea6-adce-27504c85c851"). InnerVolumeSpecName "kube-api-access-sn2tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.044418 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e9beada-5b6e-4ea6-adce-27504c85c851" (UID: "0e9beada-5b6e-4ea6-adce-27504c85c851"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.081079 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.081123 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9beada-5b6e-4ea6-adce-27504c85c851-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.081137 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2tj\" (UniqueName: \"kubernetes.io/projected/0e9beada-5b6e-4ea6-adce-27504c85c851-kube-api-access-sn2tj\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.708637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5cxt" event={"ID":"0e9beada-5b6e-4ea6-adce-27504c85c851","Type":"ContainerDied","Data":"9734c0e269530e08ba4a593da6981f3c0c0684530fd99036075d28deb13a7478"} Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.708706 4752 scope.go:117] "RemoveContainer" containerID="b102c10d3bd80d34a53a4e2f2250a45f5b4ee456bbabb8901c435b328faee8ce" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.708718 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5cxt" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.724664 4752 scope.go:117] "RemoveContainer" containerID="8d368dec77a8671ea4b52698cd9772012fe64a4eae9bf8ebbf0948888ff02747" Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.757435 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"] Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.759982 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v5cxt"] Nov 24 11:10:18 crc kubenswrapper[4752]: I1124 11:10:18.763899 4752 scope.go:117] "RemoveContainer" containerID="e1e75ab3abf9ddc1e8e212cecc26d6d9cc6e9e7ab13f2e71cbe8237a8872d173" Nov 24 11:10:20 crc kubenswrapper[4752]: I1124 11:10:20.735134 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" path="/var/lib/kubelet/pods/0e9beada-5b6e-4ea6-adce-27504c85c851/volumes" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.323773 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" containerID="cri-o://a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc" gracePeriod=15 Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.707251 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.747912 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4df5b879-wtdp8"] Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748161 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a42dcf-4065-4af0-a1e8-6b938d9c53fd" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748177 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a42dcf-4065-4af0-a1e8-6b938d9c53fd" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748191 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748200 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748210 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748219 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748232 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748240 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748254 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748262 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748278 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748299 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748310 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748318 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748327 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748335 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748349 4752 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748357 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748366 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748375 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748389 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748398 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748408 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d5896c-3f0a-46a6-ae1a-71bc6662e22b" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748416 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d5896c-3f0a-46a6-ae1a-71bc6662e22b" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748425 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748435 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="extract-content" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748448 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748456 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.748469 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748476 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="extract-utilities" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748587 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3536d4b-439e-4718-bb9a-97629f7beec8" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748599 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a42dcf-4065-4af0-a1e8-6b938d9c53fd" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748613 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3bf6b1-a2a9-4e5b-ad58-7f559317b86e" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748624 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9beada-5b6e-4ea6-adce-27504c85c851" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748975 4752 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="367c3613-ba5f-4154-9bc9-8ccdf47389f2" containerName="registry-server" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.748993 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerName="oauth-openshift" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.749008 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d5896c-3f0a-46a6-ae1a-71bc6662e22b" containerName="pruner" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.749494 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.757423 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4df5b879-wtdp8"] Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.838892 4752 generic.go:334] "Generic (PLEG): container finished" podID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" containerID="a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc" exitCode=0 Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.838937 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" event={"ID":"5618f796-dad3-4ed3-bff7-ceed08f8b07c","Type":"ContainerDied","Data":"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc"} Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.838970 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" event={"ID":"5618f796-dad3-4ed3-bff7-ceed08f8b07c","Type":"ContainerDied","Data":"5b38b0a182fb10e8de0d351c1276c310c4e2b11a494e4cd1a2387c22f9487f20"} Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.838990 4752 scope.go:117] "RemoveContainer" containerID="a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.838997 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-965hb" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.859118 4752 scope.go:117] "RemoveContainer" containerID="a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc" Nov 24 11:10:41 crc kubenswrapper[4752]: E1124 11:10:41.860874 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc\": container with ID starting with a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc not found: ID does not exist" containerID="a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.860919 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc"} err="failed to get container status \"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc\": rpc error: code = NotFound desc = could not find container \"a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc\": container with ID starting with a0d356e945c72492a80531b5a942152c2c6f6b3abf8b459ff0f41394682e79cc not found: ID does not exist" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.892636 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893204 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893231 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893270 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893299 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs\") pod 
\"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893388 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893441 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893465 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngx8g\" (UniqueName: \"kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893517 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893541 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893609 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login\") pod \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\" (UID: \"5618f796-dad3-4ed3-bff7-ceed08f8b07c\") " Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893793 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " 
pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893826 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893866 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7696f\" (UniqueName: \"kubernetes.io/projected/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-kube-api-access-7696f\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-policies\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893967 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.893989 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894027 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-dir\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894121 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894267 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894276 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894357 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.894876 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895066 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-session\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895109 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895200 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895225 4752 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895242 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.895307 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899140 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899315 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899548 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g" (OuterVolumeSpecName: "kube-api-access-ngx8g") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "kube-api-access-ngx8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899562 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899750 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.899909 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.901238 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.905128 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.905521 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5618f796-dad3-4ed3-bff7-ceed08f8b07c" (UID: "5618f796-dad3-4ed3-bff7-ceed08f8b07c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996458 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996489 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996529 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-session\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996562 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996587 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7696f\" (UniqueName: \"kubernetes.io/projected/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-kube-api-access-7696f\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996668 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-policies\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " 
pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996716 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996788 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-dir\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996827 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996915 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996931 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996944 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996958 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996970 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngx8g\" (UniqueName: \"kubernetes.io/projected/5618f796-dad3-4ed3-bff7-ceed08f8b07c-kube-api-access-ngx8g\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996981 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.996992 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997014 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997066 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997082 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5618f796-dad3-4ed3-bff7-ceed08f8b07c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997382 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-dir\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997561 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997818 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-audit-policies\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.997918 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.998342 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:41 crc kubenswrapper[4752]: I1124 11:10:41.999806 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.000063 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.000230 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-session\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.000517 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.001385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.001738 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.001843 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.002505 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.015426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7696f\" (UniqueName: \"kubernetes.io/projected/feff74e8-d288-46a8-9e3b-dfb2b9c38f05-kube-api-access-7696f\") pod \"oauth-openshift-5d4df5b879-wtdp8\" (UID: \"feff74e8-d288-46a8-9e3b-dfb2b9c38f05\") " pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.097781 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.173820 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.178784 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-965hb"] Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.275720 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4df5b879-wtdp8"] Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.737739 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5618f796-dad3-4ed3-bff7-ceed08f8b07c" path="/var/lib/kubelet/pods/5618f796-dad3-4ed3-bff7-ceed08f8b07c/volumes" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.846149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" event={"ID":"feff74e8-d288-46a8-9e3b-dfb2b9c38f05","Type":"ContainerStarted","Data":"23353fd3d50e97923cbf102b19141707b1949d8b936efeb9506290b38057eb63"} Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.847022 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.847114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" event={"ID":"feff74e8-d288-46a8-9e3b-dfb2b9c38f05","Type":"ContainerStarted","Data":"732ad7b187ef7cc421f42fa57ce52dae6ac12e851f2000dadc1204bfa95f2631"} Nov 24 11:10:42 crc kubenswrapper[4752]: I1124 11:10:42.872005 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" podStartSLOduration=26.871964988 podStartE2EDuration="26.871964988s" podCreationTimestamp="2025-11-24 11:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:10:42.869050791 +0000 UTC m=+248.853871080" 
watchObservedRunningTime="2025-11-24 11:10:42.871964988 +0000 UTC m=+248.856785277" Nov 24 11:10:43 crc kubenswrapper[4752]: I1124 11:10:43.218834 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4df5b879-wtdp8" Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.961940 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glr4w"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.962777 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-glr4w" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="registry-server" containerID="cri-o://7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7" gracePeriod=30 Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.970829 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v468m"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.971158 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v468m" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="registry-server" containerID="cri-o://a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62" gracePeriod=30 Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.980850 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.981121 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" containerID="cri-o://d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621" gracePeriod=30 Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.983907 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.984436 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbtd5" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="registry-server" containerID="cri-o://1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95" gracePeriod=30 Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.993923 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9vf6"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.994728 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.996412 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:11:02 crc kubenswrapper[4752]: I1124 11:11:02.996630 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z4nx9" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="registry-server" containerID="cri-o://062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828" gracePeriod=30 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.004553 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9vf6"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.067007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.067051 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tjpg\" (UniqueName: \"kubernetes.io/projected/077748f3-107f-424e-9084-32a79b3ac58f-kube-api-access-9tjpg\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.067108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.168799 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.168882 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.168906 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tjpg\" (UniqueName: \"kubernetes.io/projected/077748f3-107f-424e-9084-32a79b3ac58f-kube-api-access-9tjpg\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.170940 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.177505 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/077748f3-107f-424e-9084-32a79b3ac58f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.194509 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tjpg\" (UniqueName: \"kubernetes.io/projected/077748f3-107f-424e-9084-32a79b3ac58f-kube-api-access-9tjpg\") pod \"marketplace-operator-79b997595-l9vf6\" (UID: \"077748f3-107f-424e-9084-32a79b3ac58f\") " pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.313588 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.446936 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.451487 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.460563 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.465237 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v468m" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.474260 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics\") pod \"782a3373-7524-40dc-b312-08c40423ffb6\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.474322 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content\") pod \"e5633c2f-36d1-4d10-a12c-f58e67571364\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.475219 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7578\" (UniqueName: \"kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578\") pod \"1af90bc8-c663-4297-84fc-06b6663f837f\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477568 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psf4j\" (UniqueName: \"kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j\") pod \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477627 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca\") pod \"782a3373-7524-40dc-b312-08c40423ffb6\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477663 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities\") pod \"1af90bc8-c663-4297-84fc-06b6663f837f\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477699 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content\") pod \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477729 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqj2s\" (UniqueName: \"kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s\") pod \"e5633c2f-36d1-4d10-a12c-f58e67571364\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477787 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content\") pod \"1af90bc8-c663-4297-84fc-06b6663f837f\" (UID: \"1af90bc8-c663-4297-84fc-06b6663f837f\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.477812 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities\") pod \"e5633c2f-36d1-4d10-a12c-f58e67571364\" (UID: \"e5633c2f-36d1-4d10-a12c-f58e67571364\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.478935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities" (OuterVolumeSpecName: "utilities") pod "1af90bc8-c663-4297-84fc-06b6663f837f" (UID: "1af90bc8-c663-4297-84fc-06b6663f837f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.479347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ptsq\" (UniqueName: \"kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq\") pod \"782a3373-7524-40dc-b312-08c40423ffb6\" (UID: \"782a3373-7524-40dc-b312-08c40423ffb6\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.479386 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities\") pod \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\" (UID: \"0d2e36e3-15aa-4576-a13a-5b5512bb13f2\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.479824 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.480558 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities" (OuterVolumeSpecName: "utilities") pod "0d2e36e3-15aa-4576-a13a-5b5512bb13f2" (UID: "0d2e36e3-15aa-4576-a13a-5b5512bb13f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.481227 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities" (OuterVolumeSpecName: "utilities") pod "e5633c2f-36d1-4d10-a12c-f58e67571364" (UID: "e5633c2f-36d1-4d10-a12c-f58e67571364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.483199 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s" (OuterVolumeSpecName: "kube-api-access-tqj2s") pod "e5633c2f-36d1-4d10-a12c-f58e67571364" (UID: "e5633c2f-36d1-4d10-a12c-f58e67571364"). InnerVolumeSpecName "kube-api-access-tqj2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.483571 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq" (OuterVolumeSpecName: "kube-api-access-9ptsq") pod "782a3373-7524-40dc-b312-08c40423ffb6" (UID: "782a3373-7524-40dc-b312-08c40423ffb6"). InnerVolumeSpecName "kube-api-access-9ptsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.484477 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "782a3373-7524-40dc-b312-08c40423ffb6" (UID: "782a3373-7524-40dc-b312-08c40423ffb6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.485985 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.488053 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578" (OuterVolumeSpecName: "kube-api-access-q7578") pod "1af90bc8-c663-4297-84fc-06b6663f837f" (UID: "1af90bc8-c663-4297-84fc-06b6663f837f"). InnerVolumeSpecName "kube-api-access-q7578". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.496411 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "782a3373-7524-40dc-b312-08c40423ffb6" (UID: "782a3373-7524-40dc-b312-08c40423ffb6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.503931 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j" (OuterVolumeSpecName: "kube-api-access-psf4j") pod "0d2e36e3-15aa-4576-a13a-5b5512bb13f2" (UID: "0d2e36e3-15aa-4576-a13a-5b5512bb13f2"). InnerVolumeSpecName "kube-api-access-psf4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.509227 4752 generic.go:334] "Generic (PLEG): container finished" podID="2208d90e-0865-4938-8423-ea3d37ca26db" containerID="062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828" exitCode=0 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.509293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerDied","Data":"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.509326 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4nx9" event={"ID":"2208d90e-0865-4938-8423-ea3d37ca26db","Type":"ContainerDied","Data":"f6b68b980bceff1daa58abb7fd8562877c365975213a98fc20e09054f9e63577"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.509345 4752 scope.go:117] "RemoveContainer" containerID="062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.509479 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4nx9" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.524604 4752 generic.go:334] "Generic (PLEG): container finished" podID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerID="a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62" exitCode=0 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.524650 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v468m" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.524696 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerDied","Data":"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.525139 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v468m" event={"ID":"e5633c2f-36d1-4d10-a12c-f58e67571364","Type":"ContainerDied","Data":"43f5287340dbd7553b644ac5a37a0fd65596be508035f122510be3d6d91f2693"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.526649 4752 generic.go:334] "Generic (PLEG): container finished" podID="782a3373-7524-40dc-b312-08c40423ffb6" containerID="d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621" exitCode=0 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.526720 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" event={"ID":"782a3373-7524-40dc-b312-08c40423ffb6","Type":"ContainerDied","Data":"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.526787 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.527723 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1af90bc8-c663-4297-84fc-06b6663f837f" (UID: "1af90bc8-c663-4297-84fc-06b6663f837f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.529036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bps76" event={"ID":"782a3373-7524-40dc-b312-08c40423ffb6","Type":"ContainerDied","Data":"d67a0de00949b9e5e49e77b96b2873e494ff258be25e84212ee94f2aee603023"} Nov 24 11:11:03 crc kubenswrapper[4752]: W1124 11:11:03.529106 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077748f3_107f_424e_9084_32a79b3ac58f.slice/crio-fb770aacfb3f059fac40a4ac8afa3c260e78da0071d77750b9b2ff9ed1a3967d WatchSource:0}: Error finding container fb770aacfb3f059fac40a4ac8afa3c260e78da0071d77750b9b2ff9ed1a3967d: Status 404 returned error can't find the container with id fb770aacfb3f059fac40a4ac8afa3c260e78da0071d77750b9b2ff9ed1a3967d Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.531446 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerID="7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7" exitCode=0 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.531485 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerDied","Data":"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.531501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-glr4w" event={"ID":"0d2e36e3-15aa-4576-a13a-5b5512bb13f2","Type":"ContainerDied","Data":"1705f9b75b102bd8a3b847598ac4f5630a754849998552f72b01e824463bb671"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.531556 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-glr4w" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.534564 4752 generic.go:334] "Generic (PLEG): container finished" podID="1af90bc8-c663-4297-84fc-06b6663f837f" containerID="1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95" exitCode=0 Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.534609 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbtd5" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.534607 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerDied","Data":"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.534673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbtd5" event={"ID":"1af90bc8-c663-4297-84fc-06b6663f837f","Type":"ContainerDied","Data":"b1734c44ad5bff3b6a913b419c06439f789e3ff699e49c0b8f363773b163fc90"} Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.540804 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l9vf6"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.560357 4752 scope.go:117] "RemoveContainer" containerID="48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.580245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities\") pod \"2208d90e-0865-4938-8423-ea3d37ca26db\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.580597 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content\") pod \"2208d90e-0865-4938-8423-ea3d37ca26db\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.580816 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t54kf\" (UniqueName: \"kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf\") pod \"2208d90e-0865-4938-8423-ea3d37ca26db\" (UID: \"2208d90e-0865-4938-8423-ea3d37ca26db\") " Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.580971 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities" (OuterVolumeSpecName: "utilities") pod "2208d90e-0865-4938-8423-ea3d37ca26db" (UID: "2208d90e-0865-4938-8423-ea3d37ca26db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581457 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581566 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7578\" (UniqueName: \"kubernetes.io/projected/1af90bc8-c663-4297-84fc-06b6663f837f-kube-api-access-q7578\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581663 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psf4j\" (UniqueName: \"kubernetes.io/projected/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-kube-api-access-psf4j\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581781 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/782a3373-7524-40dc-b312-08c40423ffb6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581881 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.581981 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqj2s\" (UniqueName: \"kubernetes.io/projected/e5633c2f-36d1-4d10-a12c-f58e67571364-kube-api-access-tqj2s\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.582071 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1af90bc8-c663-4297-84fc-06b6663f837f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.582158 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.582261 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ptsq\" (UniqueName: \"kubernetes.io/projected/782a3373-7524-40dc-b312-08c40423ffb6-kube-api-access-9ptsq\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.582359 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.583903 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf" (OuterVolumeSpecName: "kube-api-access-t54kf") pod "2208d90e-0865-4938-8423-ea3d37ca26db" (UID: "2208d90e-0865-4938-8423-ea3d37ca26db"). InnerVolumeSpecName "kube-api-access-t54kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.588101 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d2e36e3-15aa-4576-a13a-5b5512bb13f2" (UID: "0d2e36e3-15aa-4576-a13a-5b5512bb13f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.603601 4752 scope.go:117] "RemoveContainer" containerID="fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.608714 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5633c2f-36d1-4d10-a12c-f58e67571364" (UID: "e5633c2f-36d1-4d10-a12c-f58e67571364"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.608818 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.609838 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bps76"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.612458 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.616014 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbtd5"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.622545 4752 scope.go:117] "RemoveContainer" containerID="062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.624275 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828\": container with ID starting with 062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828 not found: ID does not exist" containerID="062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.624309 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828"} err="failed to get container status \"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828\": rpc error: code = NotFound desc = could not find container \"062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828\": container with ID starting with 062bb58b82e1ef262650ef72a2b9a355058383599921482f7a591b04649db828 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.624332 4752 scope.go:117] "RemoveContainer" containerID="48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.624738 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5\": container with ID starting with 
48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5 not found: ID does not exist" containerID="48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.625025 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5"} err="failed to get container status \"48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5\": rpc error: code = NotFound desc = could not find container \"48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5\": container with ID starting with 48302c22a041592641f4d165dd55ed450af0b88535872222e33c39bc71d692e5 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.625061 4752 scope.go:117] "RemoveContainer" containerID="fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.625510 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e\": container with ID starting with fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e not found: ID does not exist" containerID="fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.625551 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e"} err="failed to get container status \"fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e\": rpc error: code = NotFound desc = could not find container \"fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e\": container with ID starting with fd88edf273056db29b27f5a0926ff22461ca626e5dbfc28e870686d80a33464e not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.625583 4752 scope.go:117] "RemoveContainer" containerID="a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.658553 4752 scope.go:117] "RemoveContainer" containerID="225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.683419 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t54kf\" (UniqueName: \"kubernetes.io/projected/2208d90e-0865-4938-8423-ea3d37ca26db-kube-api-access-t54kf\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.687904 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5633c2f-36d1-4d10-a12c-f58e67571364-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.687928 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d2e36e3-15aa-4576-a13a-5b5512bb13f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.690810 4752 scope.go:117] "RemoveContainer" containerID="07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.699028 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2208d90e-0865-4938-8423-ea3d37ca26db" (UID: "2208d90e-0865-4938-8423-ea3d37ca26db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.708305 4752 scope.go:117] "RemoveContainer" containerID="a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.708712 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62\": container with ID starting with a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62 not found: ID does not exist" containerID="a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.708773 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62"} err="failed to get container status \"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62\": rpc error: code = NotFound desc = could not find container \"a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62\": container with ID starting with a548b00beaefda12074bd2fcad971cb565b2d71ef57521cd9e29f4089fa79e62 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.708807 4752 scope.go:117] "RemoveContainer" containerID="225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.709136 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a\": container with ID starting with 225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a not found: ID does not exist" containerID="225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.709177 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a"} err="failed to get container status \"225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a\": rpc error: code = NotFound desc = could not find container \"225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a\": container with ID starting with 225ef182d4370b29d5d01f61bc062bd384cd95d7759c72e6117bee53822e155a not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.709227 4752 scope.go:117] "RemoveContainer" containerID="07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.709577 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7\": container with ID starting with 07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7 not found: ID does not exist" containerID="07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.709598 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7"} err="failed to get container status \"07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7\": rpc error: code = NotFound desc = could not find container \"07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7\": container with ID starting with 07ba07891f5ac3167945a163de60ae65c1b50b594843101f54eed067659e8bb7 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.709611 4752 scope.go:117] "RemoveContainer" containerID="d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.721878 4752 scope.go:117] "RemoveContainer" containerID="d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.722289 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621\": container with ID starting with d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621 not found: ID does not exist" containerID="d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.722328 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621"} err="failed to get container status \"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621\": rpc error: code = NotFound desc = could not find container \"d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621\": container with ID starting with d6d40fef72b1cf1d1f0b1dc19406741ac4be722a13fb920e14e32f064b68c621 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.722356 4752 scope.go:117] "RemoveContainer" containerID="7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.735802 4752 scope.go:117] "RemoveContainer" containerID="518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.753912 4752 scope.go:117] "RemoveContainer" containerID="6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.770211 4752 scope.go:117] "RemoveContainer" containerID="7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.771056 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7\": container with ID starting with 7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7 not found: ID does not exist" containerID="7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771084 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7"} err="failed to get container status \"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7\": rpc error: code = NotFound desc = could not find container \"7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7\": container with ID starting 
with 7c581a3b1365d35dce7abf8c0b4ac4753f79d66e2d1b46d3e990ed687e5e8bc7 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771109 4752 scope.go:117] "RemoveContainer" containerID="518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.771387 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a\": container with ID starting with 518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a not found: ID does not exist" containerID="518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771408 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a"} err="failed to get container status \"518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a\": rpc error: code = NotFound desc = could not find container \"518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a\": container with ID starting with 518243612bd992d205291561ed5239fb188280a63197459446cad7e3c7dfe55a not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771420 4752 scope.go:117] "RemoveContainer" containerID="6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.771841 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2\": container with ID starting with 6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2 not found: ID does not exist" containerID="6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771862 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2"} err="failed to get container status \"6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2\": rpc error: code = NotFound desc = could not find container \"6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2\": container with ID starting with 6c818ee8855beeff21b3d4b90bc049dcc0f70237552df757f79d7a844895fac2 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.771876 4752 scope.go:117] "RemoveContainer" containerID="1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.787795 4752 scope.go:117] "RemoveContainer" containerID="864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.789850 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2208d90e-0865-4938-8423-ea3d37ca26db-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.801850 4752 scope.go:117] "RemoveContainer" containerID="b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.817617 4752 scope.go:117] "RemoveContainer" 
containerID="1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.818158 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95\": container with ID starting with 1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95 not found: ID does not exist" containerID="1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.818211 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95"} err="failed to get container status \"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95\": rpc error: code = NotFound desc = could not find container \"1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95\": container with ID starting with 1fda4635c4542bbdca9c05c4d104f85430f73879b0182da9948599abd81f5e95 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.818259 4752 scope.go:117] "RemoveContainer" containerID="864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.818960 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7\": container with ID starting with 864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7 not found: ID does not exist" containerID="864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.818994 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7"} err="failed to get container status \"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7\": rpc error: code = NotFound desc = could not find container \"864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7\": container with ID starting with 864aa92518a67a21764c35661bffde90dde142385f1cd11d26c1b1b8ac9f69e7 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.819018 4752 scope.go:117] "RemoveContainer" containerID="b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2" Nov 24 11:11:03 crc kubenswrapper[4752]: E1124 11:11:03.820214 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2\": container with ID starting with b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2 not found: ID does not exist" containerID="b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.820288 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2"} err="failed to get container status \"b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2\": rpc error: code = NotFound desc = could not find container \"b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2\": container with ID starting with 
b1c68be0687d4a0c998694a486d09ad578f9d1a05496be8eefa06648030f88c2 not found: ID does not exist" Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.837877 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.854772 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z4nx9"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.870639 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-glr4w"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.875044 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-glr4w"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.887516 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v468m"] Nov 24 11:11:03 crc kubenswrapper[4752]: I1124 11:11:03.890153 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v468m"] Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.548351 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" event={"ID":"077748f3-107f-424e-9084-32a79b3ac58f","Type":"ContainerStarted","Data":"74e859d5ed528a2174db836527e95adeb7730e4e374dcc8e8dba9974c1e1e21c"} Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.548404 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" event={"ID":"077748f3-107f-424e-9084-32a79b3ac58f","Type":"ContainerStarted","Data":"fb770aacfb3f059fac40a4ac8afa3c260e78da0071d77750b9b2ff9ed1a3967d"} Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.548662 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.552256 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.604016 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-l9vf6" podStartSLOduration=2.603991496 podStartE2EDuration="2.603991496s" podCreationTimestamp="2025-11-24 11:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:11:04.577407334 +0000 UTC m=+270.562227663" watchObservedRunningTime="2025-11-24 11:11:04.603991496 +0000 UTC m=+270.588811795" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.734666 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" path="/var/lib/kubelet/pods/0d2e36e3-15aa-4576-a13a-5b5512bb13f2/volumes" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.735297 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" path="/var/lib/kubelet/pods/1af90bc8-c663-4297-84fc-06b6663f837f/volumes" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.735871 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" path="/var/lib/kubelet/pods/2208d90e-0865-4938-8423-ea3d37ca26db/volumes" Nov 24 11:11:04 crc 
kubenswrapper[4752]: I1124 11:11:04.736444 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782a3373-7524-40dc-b312-08c40423ffb6" path="/var/lib/kubelet/pods/782a3373-7524-40dc-b312-08c40423ffb6/volumes" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.736898 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" path="/var/lib/kubelet/pods/e5633c2f-36d1-4d10-a12c-f58e67571364/volumes" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976487 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jj77l"] Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976710 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976724 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976735 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976761 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976772 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976780 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976788 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976795 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976806 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976813 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976823 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976831 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976840 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976846 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976857 4752 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976864 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976875 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976882 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976890 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976897 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976908 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976916 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="extract-content" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976926 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976934 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: E1124 11:11:04.976944 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.976951 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="extract-utilities" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.977055 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2e36e3-15aa-4576-a13a-5b5512bb13f2" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.977070 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af90bc8-c663-4297-84fc-06b6663f837f" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.977087 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5633c2f-36d1-4d10-a12c-f58e67571364" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.977097 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="782a3373-7524-40dc-b312-08c40423ffb6" containerName="marketplace-operator" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.977107 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2208d90e-0865-4938-8423-ea3d37ca26db" containerName="registry-server" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.978330 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.980223 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 24 11:11:04 crc kubenswrapper[4752]: I1124 11:11:04.983421 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jj77l"] Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.005161 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdfqw\" (UniqueName: \"kubernetes.io/projected/9794b695-5b9a-42ba-8e52-8aa1e8b95866-kube-api-access-kdfqw\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.005233 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-catalog-content\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.005297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-utilities\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.106646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-catalog-content\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.107084 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-utilities\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.107116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdfqw\" (UniqueName: \"kubernetes.io/projected/9794b695-5b9a-42ba-8e52-8aa1e8b95866-kube-api-access-kdfqw\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.107798 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-catalog-content\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.108018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9794b695-5b9a-42ba-8e52-8aa1e8b95866-utilities\") pod \"certified-operators-jj77l\" (UID: 
\"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.129516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdfqw\" (UniqueName: \"kubernetes.io/projected/9794b695-5b9a-42ba-8e52-8aa1e8b95866-kube-api-access-kdfqw\") pod \"certified-operators-jj77l\" (UID: \"9794b695-5b9a-42ba-8e52-8aa1e8b95866\") " pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.305782 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.481638 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jj77l"] Nov 24 11:11:05 crc kubenswrapper[4752]: W1124 11:11:05.485442 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9794b695_5b9a_42ba_8e52_8aa1e8b95866.slice/crio-d31475fdf0b879057ca8dec86336b1eef274f6eadeaa88590355fc8ceaa18d50 WatchSource:0}: Error finding container d31475fdf0b879057ca8dec86336b1eef274f6eadeaa88590355fc8ceaa18d50: Status 404 returned error can't find the container with id d31475fdf0b879057ca8dec86336b1eef274f6eadeaa88590355fc8ceaa18d50 Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.556515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jj77l" event={"ID":"9794b695-5b9a-42ba-8e52-8aa1e8b95866","Type":"ContainerStarted","Data":"d31475fdf0b879057ca8dec86336b1eef274f6eadeaa88590355fc8ceaa18d50"} Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.566591 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jngx"] Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.567849 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.570029 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.578043 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jngx"] Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.613559 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknjh\" (UniqueName: \"kubernetes.io/projected/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-kube-api-access-cknjh\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.613636 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-utilities\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.613701 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-catalog-content\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.714833 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknjh\" (UniqueName: \"kubernetes.io/projected/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-kube-api-access-cknjh\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.714894 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-utilities\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.714946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-catalog-content\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.715455 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-catalog-content\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.715682 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-utilities\") pod \"redhat-marketplace-5jngx\" (UID: 
\"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.736687 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknjh\" (UniqueName: \"kubernetes.io/projected/7764e5eb-d7c5-4c69-8aa3-2dfd20e05660-kube-api-access-cknjh\") pod \"redhat-marketplace-5jngx\" (UID: \"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660\") " pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:05 crc kubenswrapper[4752]: I1124 11:11:05.900983 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.090489 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jngx"] Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.561294 4752 generic.go:334] "Generic (PLEG): container finished" podID="9794b695-5b9a-42ba-8e52-8aa1e8b95866" containerID="8113021df0d8e0507cc61b5137e3371fa7a2ff624c775b60dc9483e901ce6afe" exitCode=0 Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.561388 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jj77l" event={"ID":"9794b695-5b9a-42ba-8e52-8aa1e8b95866","Type":"ContainerDied","Data":"8113021df0d8e0507cc61b5137e3371fa7a2ff624c775b60dc9483e901ce6afe"} Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.563008 4752 generic.go:334] "Generic (PLEG): container finished" podID="7764e5eb-d7c5-4c69-8aa3-2dfd20e05660" containerID="250846695e46d3d12dcdf80faf04d59815a8a31a1784108c3fb1240992db4eb1" exitCode=0 Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.563872 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jngx" event={"ID":"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660","Type":"ContainerDied","Data":"250846695e46d3d12dcdf80faf04d59815a8a31a1784108c3fb1240992db4eb1"} Nov 24 11:11:06 crc kubenswrapper[4752]: I1124 11:11:06.563967 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jngx" event={"ID":"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660","Type":"ContainerStarted","Data":"bd83541c2797bd3053b58686348b315988b9f2c78c4bebbedad70024be32c5d4"} Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.375767 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.378497 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.381221 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.390111 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.433859 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.433946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9wv\" (UniqueName: \"kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.433971 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.535618 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.535699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.535796 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9wv\" (UniqueName: \"kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.536523 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.537288 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " 
pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.556937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9wv\" (UniqueName: \"kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv\") pod \"redhat-operators-8p8mr\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.569362 4752 generic.go:334] "Generic (PLEG): container finished" podID="7764e5eb-d7c5-4c69-8aa3-2dfd20e05660" containerID="b731b900b9803069dee2fe3ca6325633ad05f588200f898554c9465eb184b36a" exitCode=0 Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.569843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jngx" event={"ID":"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660","Type":"ContainerDied","Data":"b731b900b9803069dee2fe3ca6325633ad05f588200f898554c9465eb184b36a"} Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.573155 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jj77l" event={"ID":"9794b695-5b9a-42ba-8e52-8aa1e8b95866","Type":"ContainerStarted","Data":"c426dd7731dd620e122f86427c3e3707eb4a8c460a7caff94f4d1fc62e8e1b9f"} Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.701028 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.887439 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:11:07 crc kubenswrapper[4752]: W1124 11:11:07.896210 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00198a03_44e5_42de_93b2_667fd5981ac4.slice/crio-1bd87c35b8a7eb415b791e67efc865d992251c212e9e53cb030416cd67aefb90 WatchSource:0}: Error finding container 1bd87c35b8a7eb415b791e67efc865d992251c212e9e53cb030416cd67aefb90: Status 404 returned error can't find the container with id 1bd87c35b8a7eb415b791e67efc865d992251c212e9e53cb030416cd67aefb90 Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.973526 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.975007 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.978632 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 24 11:11:07 crc kubenswrapper[4752]: I1124 11:11:07.981882 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.040108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.040147 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25ggt\" (UniqueName: \"kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.040185 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.141363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.141422 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25ggt\" (UniqueName: \"kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.141480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.141906 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.141990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content\") pod \"community-operators-pcgzp\" (UID: 
\"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.167281 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25ggt\" (UniqueName: \"kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt\") pod \"community-operators-pcgzp\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.312154 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.581169 4752 generic.go:334] "Generic (PLEG): container finished" podID="9794b695-5b9a-42ba-8e52-8aa1e8b95866" containerID="c426dd7731dd620e122f86427c3e3707eb4a8c460a7caff94f4d1fc62e8e1b9f" exitCode=0 Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.581286 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jj77l" event={"ID":"9794b695-5b9a-42ba-8e52-8aa1e8b95866","Type":"ContainerDied","Data":"c426dd7731dd620e122f86427c3e3707eb4a8c460a7caff94f4d1fc62e8e1b9f"} Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.585450 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jngx" event={"ID":"7764e5eb-d7c5-4c69-8aa3-2dfd20e05660","Type":"ContainerStarted","Data":"3f64f172983730582e92de6b30ce20bf0f7756c7ec447dee768cfc1cbcde2e98"} Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.590136 4752 generic.go:334] "Generic (PLEG): container finished" podID="00198a03-44e5-42de-93b2-667fd5981ac4" containerID="c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3" exitCode=0 Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.590182 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerDied","Data":"c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3"} Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.590207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerStarted","Data":"1bd87c35b8a7eb415b791e67efc865d992251c212e9e53cb030416cd67aefb90"} Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.634430 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jngx" podStartSLOduration=2.206241001 podStartE2EDuration="3.634412082s" podCreationTimestamp="2025-11-24 11:11:05 +0000 UTC" firstStartedPulling="2025-11-24 11:11:06.564829634 +0000 UTC m=+272.549649923" lastFinishedPulling="2025-11-24 11:11:07.993000715 +0000 UTC m=+273.977821004" observedRunningTime="2025-11-24 11:11:08.634385782 +0000 UTC m=+274.619206071" watchObservedRunningTime="2025-11-24 11:11:08.634412082 +0000 UTC m=+274.619232371" Nov 24 11:11:08 crc kubenswrapper[4752]: I1124 11:11:08.739193 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:11:09 crc kubenswrapper[4752]: I1124 11:11:09.598271 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jj77l" 
event={"ID":"9794b695-5b9a-42ba-8e52-8aa1e8b95866","Type":"ContainerStarted","Data":"3bcf8f45ffc50580b86be55a260e025b7fd84967c98c11118cb11a4216d425ef"} Nov 24 11:11:09 crc kubenswrapper[4752]: I1124 11:11:09.604505 4752 generic.go:334] "Generic (PLEG): container finished" podID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerID="8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22" exitCode=0 Nov 24 11:11:09 crc kubenswrapper[4752]: I1124 11:11:09.604567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerDied","Data":"8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22"} Nov 24 11:11:09 crc kubenswrapper[4752]: I1124 11:11:09.604672 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerStarted","Data":"5ad0b8921f3bbfddbeaaf2152ed651a3ed1f4ffbd299afbfa82bb2aa037b00e3"} Nov 24 11:11:09 crc kubenswrapper[4752]: I1124 11:11:09.617871 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jj77l" podStartSLOduration=3.171145782 podStartE2EDuration="5.617853494s" podCreationTimestamp="2025-11-24 11:11:04 +0000 UTC" firstStartedPulling="2025-11-24 11:11:06.563932568 +0000 UTC m=+272.548752857" lastFinishedPulling="2025-11-24 11:11:09.01064028 +0000 UTC m=+274.995460569" observedRunningTime="2025-11-24 11:11:09.617831293 +0000 UTC m=+275.602651592" watchObservedRunningTime="2025-11-24 11:11:09.617853494 +0000 UTC m=+275.602673783" Nov 24 11:11:10 crc kubenswrapper[4752]: I1124 11:11:10.614557 4752 generic.go:334] "Generic (PLEG): container finished" podID="00198a03-44e5-42de-93b2-667fd5981ac4" containerID="9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b" exitCode=0 Nov 24 11:11:10 crc kubenswrapper[4752]: I1124 11:11:10.614647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerDied","Data":"9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b"} Nov 24 11:11:10 crc kubenswrapper[4752]: I1124 11:11:10.617291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerStarted","Data":"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02"} Nov 24 11:11:11 crc kubenswrapper[4752]: I1124 11:11:11.628016 4752 generic.go:334] "Generic (PLEG): container finished" podID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerID="963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02" exitCode=0 Nov 24 11:11:11 crc kubenswrapper[4752]: I1124 11:11:11.628128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerDied","Data":"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02"} Nov 24 11:11:12 crc kubenswrapper[4752]: I1124 11:11:12.644023 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerStarted","Data":"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395"} Nov 24 11:11:12 crc kubenswrapper[4752]: I1124 11:11:12.647469 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerStarted","Data":"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f"} Nov 24 11:11:12 crc kubenswrapper[4752]: I1124 11:11:12.663856 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pcgzp" podStartSLOduration=3.237649123 podStartE2EDuration="5.663838899s" podCreationTimestamp="2025-11-24 11:11:07 +0000 UTC" firstStartedPulling="2025-11-24 11:11:09.602973421 +0000 UTC m=+275.587793710" lastFinishedPulling="2025-11-24 11:11:12.029163197 +0000 UTC m=+278.013983486" observedRunningTime="2025-11-24 11:11:12.663552441 +0000 UTC m=+278.648372730" watchObservedRunningTime="2025-11-24 11:11:12.663838899 +0000 UTC m=+278.648659178" Nov 24 11:11:12 crc kubenswrapper[4752]: I1124 11:11:12.680039 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8p8mr" podStartSLOduration=3.229563519 podStartE2EDuration="5.68002059s" podCreationTimestamp="2025-11-24 11:11:07 +0000 UTC" firstStartedPulling="2025-11-24 11:11:08.591777593 +0000 UTC m=+274.576597882" lastFinishedPulling="2025-11-24 11:11:11.042234664 +0000 UTC m=+277.027054953" observedRunningTime="2025-11-24 11:11:12.676967131 +0000 UTC m=+278.661787420" watchObservedRunningTime="2025-11-24 11:11:12.68002059 +0000 UTC m=+278.664840879" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.306141 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.306499 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.346381 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.702165 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jj77l" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.901894 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.902031 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:15 crc kubenswrapper[4752]: I1124 11:11:15.940554 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:16 crc kubenswrapper[4752]: I1124 11:11:16.712552 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jngx" Nov 24 11:11:17 crc kubenswrapper[4752]: I1124 11:11:17.702092 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:17 crc kubenswrapper[4752]: I1124 11:11:17.702442 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:18 crc kubenswrapper[4752]: I1124 11:11:18.313284 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:18 crc kubenswrapper[4752]: I1124 11:11:18.313364 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:18 crc kubenswrapper[4752]: I1124 11:11:18.357974 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:18 crc kubenswrapper[4752]: I1124 11:11:18.718388 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:11:18 crc kubenswrapper[4752]: I1124 11:11:18.746691 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8p8mr" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="registry-server" probeResult="failure" output=< Nov 24 11:11:18 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 11:11:18 crc kubenswrapper[4752]: > Nov 24 11:11:27 crc kubenswrapper[4752]: I1124 11:11:27.763259 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:27 crc kubenswrapper[4752]: I1124 11:11:27.804462 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:11:34 crc kubenswrapper[4752]: I1124 11:11:34.516152 4752 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Nov 24 11:12:15 crc kubenswrapper[4752]: I1124 11:12:15.469058 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:12:15 crc kubenswrapper[4752]: I1124 11:12:15.469807 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:12:45 crc kubenswrapper[4752]: I1124 11:12:45.469095 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:12:45 crc kubenswrapper[4752]: I1124 11:12:45.469944 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.469574 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.470525 4752 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.470610 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.471610 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.471710 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088" gracePeriod=600 Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.690887 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088" exitCode=0 Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.690953 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088"} Nov 24 11:13:15 crc kubenswrapper[4752]: I1124 11:13:15.691366 4752 scope.go:117] "RemoveContainer" containerID="19b6540a68ba35f1f647d8d4bc6fe738e35d1da8b8a4b8cdf8347e246bdc9a3d" Nov 24 11:13:16 crc kubenswrapper[4752]: I1124 11:13:16.697940 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4"} Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.362184 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6g57b"] Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.363556 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.382899 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6g57b"] Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478472 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-tls\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478530 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-trusted-ca\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478561 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aed29c5f-adef-4f28-891c-83f9e54d7609-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478584 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-certificates\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478609 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aed29c5f-adef-4f28-891c-83f9e54d7609-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478637 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvzw\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-kube-api-access-qsvzw\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478661 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-bound-sa-token\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.478690 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.500484 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-tls\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580301 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-trusted-ca\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580330 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aed29c5f-adef-4f28-891c-83f9e54d7609-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-certificates\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580373 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aed29c5f-adef-4f28-891c-83f9e54d7609-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580405 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsvzw\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-kube-api-access-qsvzw\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.580428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-bound-sa-token\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.581227 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aed29c5f-adef-4f28-891c-83f9e54d7609-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.581933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-certificates\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.583272 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aed29c5f-adef-4f28-891c-83f9e54d7609-trusted-ca\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.585989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-registry-tls\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.586441 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aed29c5f-adef-4f28-891c-83f9e54d7609-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.596108 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-bound-sa-token\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.598461 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvzw\" (UniqueName: \"kubernetes.io/projected/aed29c5f-adef-4f28-891c-83f9e54d7609-kube-api-access-qsvzw\") pod \"image-registry-66df7c8f76-6g57b\" (UID: \"aed29c5f-adef-4f28-891c-83f9e54d7609\") " pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:38 crc kubenswrapper[4752]: I1124 11:14:38.682702 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:39 crc kubenswrapper[4752]: I1124 11:14:39.134204 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6g57b"] Nov 24 11:14:39 crc kubenswrapper[4752]: I1124 11:14:39.248844 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" event={"ID":"aed29c5f-adef-4f28-891c-83f9e54d7609","Type":"ContainerStarted","Data":"ad91b3ad884fde613c9822c50bc470a3219123bd0403748d6c00b6a191845ed5"} Nov 24 11:14:40 crc kubenswrapper[4752]: I1124 11:14:40.257407 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" event={"ID":"aed29c5f-adef-4f28-891c-83f9e54d7609","Type":"ContainerStarted","Data":"19ccae5995adeb03606900f51ac9116516487354abb2b5596dc40d5402245601"} Nov 24 11:14:40 crc kubenswrapper[4752]: I1124 11:14:40.258109 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:40 crc kubenswrapper[4752]: I1124 11:14:40.288808 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" podStartSLOduration=2.288729753 podStartE2EDuration="2.288729753s" podCreationTimestamp="2025-11-24 11:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:14:40.285103799 +0000 UTC m=+486.269924148" watchObservedRunningTime="2025-11-24 11:14:40.288729753 +0000 UTC m=+486.273550122" Nov 24 11:14:58 crc kubenswrapper[4752]: I1124 11:14:58.690077 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6g57b" Nov 24 11:14:58 crc kubenswrapper[4752]: I1124 11:14:58.763208 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.136419 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt"] Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.137420 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.140126 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.141111 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.147089 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt"] Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.306150 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jlm\" (UniqueName: \"kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.306477 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.306635 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.407617 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8jlm\" (UniqueName: \"kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.407692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.407775 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.409064 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume\") pod 
\"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.416117 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.427293 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jlm\" (UniqueName: \"kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm\") pod \"collect-profiles-29399715-jd4wt\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.463515 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:00 crc kubenswrapper[4752]: I1124 11:15:00.661023 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt"] Nov 24 11:15:00 crc kubenswrapper[4752]: W1124 11:15:00.666841 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881eb454_d8b6_4b23_a23b_c3e0fc44d97c.slice/crio-b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc WatchSource:0}: Error finding container b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc: Status 404 returned error can't find the container with id b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc Nov 24 11:15:01 crc kubenswrapper[4752]: I1124 11:15:01.397575 4752 generic.go:334] "Generic (PLEG): container finished" podID="881eb454-d8b6-4b23-a23b-c3e0fc44d97c" containerID="1b75735ac8682a5ee99c592c9ca633b197599670c14e997d46c504c8dcd42504" exitCode=0 Nov 24 11:15:01 crc kubenswrapper[4752]: I1124 11:15:01.397694 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" event={"ID":"881eb454-d8b6-4b23-a23b-c3e0fc44d97c","Type":"ContainerDied","Data":"1b75735ac8682a5ee99c592c9ca633b197599670c14e997d46c504c8dcd42504"} Nov 24 11:15:01 crc kubenswrapper[4752]: I1124 11:15:01.397924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" event={"ID":"881eb454-d8b6-4b23-a23b-c3e0fc44d97c","Type":"ContainerStarted","Data":"b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc"} Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.681718 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.839710 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume\") pod \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.839793 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume\") pod \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.839911 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8jlm\" (UniqueName: \"kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm\") pod \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\" (UID: \"881eb454-d8b6-4b23-a23b-c3e0fc44d97c\") " Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.841129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume" (OuterVolumeSpecName: "config-volume") pod "881eb454-d8b6-4b23-a23b-c3e0fc44d97c" (UID: "881eb454-d8b6-4b23-a23b-c3e0fc44d97c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.849699 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "881eb454-d8b6-4b23-a23b-c3e0fc44d97c" (UID: "881eb454-d8b6-4b23-a23b-c3e0fc44d97c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.849993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm" (OuterVolumeSpecName: "kube-api-access-t8jlm") pod "881eb454-d8b6-4b23-a23b-c3e0fc44d97c" (UID: "881eb454-d8b6-4b23-a23b-c3e0fc44d97c"). InnerVolumeSpecName "kube-api-access-t8jlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.941284 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.941324 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:02 crc kubenswrapper[4752]: I1124 11:15:02.941336 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8jlm\" (UniqueName: \"kubernetes.io/projected/881eb454-d8b6-4b23-a23b-c3e0fc44d97c-kube-api-access-t8jlm\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:03 crc kubenswrapper[4752]: I1124 11:15:03.410999 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" event={"ID":"881eb454-d8b6-4b23-a23b-c3e0fc44d97c","Type":"ContainerDied","Data":"b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc"} Nov 24 11:15:03 crc kubenswrapper[4752]: I1124 11:15:03.411055 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90422e83718a1b76d4f69f673f1c30db22a24615e720a5a713626c4cd0919cc" Nov 24 11:15:03 crc kubenswrapper[4752]: I1124 11:15:03.411081 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt" Nov 24 11:15:15 crc kubenswrapper[4752]: I1124 11:15:15.469437 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:15:15 crc kubenswrapper[4752]: I1124 11:15:15.470606 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:15:23 crc kubenswrapper[4752]: I1124 11:15:23.831105 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" podUID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" containerName="registry" containerID="cri-o://66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9" gracePeriod=30 Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.229615 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.348300 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.348400 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.348839 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.348910 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6lhf\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.348980 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.349038 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.349209 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.349281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token\") pod \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\" (UID: \"a3037291-c53e-4eb9-ae1b-00f71fee5cc5\") " Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.350268 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.350833 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.356962 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.358463 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.362850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.363327 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf" (OuterVolumeSpecName: "kube-api-access-d6lhf") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "kube-api-access-d6lhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.366037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.370196 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a3037291-c53e-4eb9-ae1b-00f71fee5cc5" (UID: "a3037291-c53e-4eb9-ae1b-00f71fee5cc5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452006 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452668 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452703 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452795 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6lhf\" (UniqueName: \"kubernetes.io/projected/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-kube-api-access-d6lhf\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452869 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452894 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.452966 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3037291-c53e-4eb9-ae1b-00f71fee5cc5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.550886 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" containerID="66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9" exitCode=0 Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.550954 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" event={"ID":"a3037291-c53e-4eb9-ae1b-00f71fee5cc5","Type":"ContainerDied","Data":"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9"} Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.550984 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.551030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4lbhg" event={"ID":"a3037291-c53e-4eb9-ae1b-00f71fee5cc5","Type":"ContainerDied","Data":"5bb86b7834bb7d0e881873d47ed42b71849a47fb9c3f4b96005d3f0bdde3bdd2"} Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.551075 4752 scope.go:117] "RemoveContainer" containerID="66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.576575 4752 scope.go:117] "RemoveContainer" containerID="66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9" Nov 24 11:15:24 crc kubenswrapper[4752]: E1124 11:15:24.577562 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9\": container with ID starting with 66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9 not found: ID does not exist" containerID="66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.577621 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9"} err="failed to get container status \"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9\": rpc error: code = NotFound desc = could not find container \"66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9\": container with ID starting with 66b76c9c1eb3a230a607deeed25a0e305fed9f58735e466bacb39e2685a14ab9 not found: ID does not exist" Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.602955 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.606741 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4lbhg"] Nov 24 11:15:24 crc kubenswrapper[4752]: I1124 11:15:24.738916 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" path="/var/lib/kubelet/pods/a3037291-c53e-4eb9-ae1b-00f71fee5cc5/volumes" Nov 24 11:15:45 crc kubenswrapper[4752]: I1124 11:15:45.469273 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:15:45 crc kubenswrapper[4752]: I1124 11:15:45.470033 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.469548 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 
24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.470951 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.471047 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.471995 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.472094 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4" gracePeriod=600 Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.887598 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4" exitCode=0 Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.887651 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4"} Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.888079 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b"} Nov 24 11:16:15 crc kubenswrapper[4752]: I1124 11:16:15.888117 4752 scope.go:117] "RemoveContainer" containerID="ab688d8032601fa7d70507bf76fab1a716e3b0d71d300597a4a57acbce817088" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.454801 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-x24q4"] Nov 24 11:17:41 crc kubenswrapper[4752]: E1124 11:17:41.455841 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881eb454-d8b6-4b23-a23b-c3e0fc44d97c" containerName="collect-profiles" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.455869 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="881eb454-d8b6-4b23-a23b-c3e0fc44d97c" containerName="collect-profiles" Nov 24 11:17:41 crc kubenswrapper[4752]: E1124 11:17:41.455899 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" containerName="registry" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.455912 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" containerName="registry" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.456166 4752 
memory_manager.go:354] "RemoveStaleState removing state" podUID="881eb454-d8b6-4b23-a23b-c3e0fc44d97c" containerName="collect-profiles" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.456193 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3037291-c53e-4eb9-ae1b-00f71fee5cc5" containerName="registry" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.456818 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.460519 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.460986 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.461554 4752 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wzbzq" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.461792 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.466864 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-x24q4"] Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.614819 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.614932 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.615069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5xtw\" (UniqueName: \"kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.717139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.717617 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.718055 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5xtw\" (UniqueName: \"kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw\") pod 
\"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.718222 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.718258 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.746715 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5xtw\" (UniqueName: \"kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw\") pod \"crc-storage-crc-x24q4\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:41 crc kubenswrapper[4752]: I1124 11:17:41.778938 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:42 crc kubenswrapper[4752]: I1124 11:17:42.276067 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-x24q4"] Nov 24 11:17:42 crc kubenswrapper[4752]: I1124 11:17:42.288116 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 11:17:42 crc kubenswrapper[4752]: I1124 11:17:42.441455 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x24q4" event={"ID":"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f","Type":"ContainerStarted","Data":"21ed5145591bfebc61d2a65852cf9a7d00c961feb8dfa81403467b2629c1f487"} Nov 24 11:17:44 crc kubenswrapper[4752]: I1124 11:17:44.454651 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" containerID="49a92d8cacf21c15dacb7e2e661eec4d93f24f9175ad5a8324777a4edc76aac5" exitCode=0 Nov 24 11:17:44 crc kubenswrapper[4752]: I1124 11:17:44.454734 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x24q4" event={"ID":"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f","Type":"ContainerDied","Data":"49a92d8cacf21c15dacb7e2e661eec4d93f24f9175ad5a8324777a4edc76aac5"} Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.830551 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.988453 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5xtw\" (UniqueName: \"kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw\") pod \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.988521 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage\") pod \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.988610 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt\") pod \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\" (UID: \"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f\") " Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.988884 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" (UID: "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.989238 4752 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:45 crc kubenswrapper[4752]: I1124 11:17:45.997007 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw" (OuterVolumeSpecName: "kube-api-access-j5xtw") pod "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" (UID: "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f"). InnerVolumeSpecName "kube-api-access-j5xtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.002658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" (UID: "4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.091047 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5xtw\" (UniqueName: \"kubernetes.io/projected/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-kube-api-access-j5xtw\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.091104 4752 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.470062 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x24q4" event={"ID":"4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f","Type":"ContainerDied","Data":"21ed5145591bfebc61d2a65852cf9a7d00c961feb8dfa81403467b2629c1f487"} Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.470126 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21ed5145591bfebc61d2a65852cf9a7d00c961feb8dfa81403467b2629c1f487" Nov 24 11:17:46 crc kubenswrapper[4752]: I1124 11:17:46.470148 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x24q4" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.353701 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf"] Nov 24 11:17:53 crc kubenswrapper[4752]: E1124 11:17:53.354722 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" containerName="storage" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.354775 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" containerName="storage" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.354938 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" containerName="storage" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.356170 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.361562 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.364571 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf"] Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.536616 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpzfw\" (UniqueName: \"kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.537027 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.537265 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.638273 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.638541 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpzfw\" (UniqueName: \"kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.638592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.639517 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.640073 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.673179 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpzfw\" (UniqueName: \"kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:53 crc kubenswrapper[4752]: I1124 11:17:53.694408 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:54 crc kubenswrapper[4752]: I1124 11:17:54.218193 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf"] Nov 24 11:17:54 crc kubenswrapper[4752]: I1124 11:17:54.522812 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerStarted","Data":"b05f408a1cb271a222d095cc68b42588d6509266f2ae403343c066983b972560"} Nov 24 11:17:54 crc kubenswrapper[4752]: I1124 11:17:54.522914 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerStarted","Data":"a39b1bf1d8cb8f3a552d80a1fee9c847ba41d5ef53de1897b9eb777e048bee59"} Nov 24 11:17:55 crc kubenswrapper[4752]: I1124 11:17:55.532164 4752 generic.go:334] "Generic (PLEG): container finished" podID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerID="b05f408a1cb271a222d095cc68b42588d6509266f2ae403343c066983b972560" exitCode=0 Nov 24 11:17:55 crc kubenswrapper[4752]: I1124 11:17:55.532250 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerDied","Data":"b05f408a1cb271a222d095cc68b42588d6509266f2ae403343c066983b972560"} Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.511686 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bkksr"] Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.541327 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-controller" containerID="cri-o://d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" gracePeriod=30 Nov 24 11:17:56 crc 
kubenswrapper[4752]: I1124 11:17:56.541736 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="sbdb" containerID="cri-o://b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.541836 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="nbdb" containerID="cri-o://a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.541897 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="northd" containerID="cri-o://c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.541950 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.542002 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-node" containerID="cri-o://8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.542057 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-acl-logging" containerID="cri-o://5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.583104 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" containerID="cri-o://391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" gracePeriod=30 Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.850505 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/3.log" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.852805 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovn-acl-logging/0.log" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.853231 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovn-controller/0.log"
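The eight "Killing container with a grace period" records above are the kubelet acting on the SyncLoop DELETE for ovnkube-node-bkksr: every container in the pod is signalled with the same gracePeriod=30, and each container's log file is parsed one last time before teardown. A short Go sketch that condenses such records into a kill summary when journal text is piped in on stdin; the key=value field order is copied from the records above, while the file name and everything else here is illustrative.

// killsummary.go (hypothetical name): summarize kubelet container-kill records.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Field layout mirrors the records above; not a stable kubelet interface.
var kill = regexp.MustCompile(`"Killing container with a grace period"` +
	` pod="([^"]+)" podUID="[^"]+" containerName="([^"]+)"` +
	` containerID="([^"]+)" gracePeriod=(\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal records can be very long
	for sc.Scan() {
		if m := kill.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%-45s %-30s grace=%ss\n", m[1], m[2], m[4])
		}
	}
}

Fed the records above (for example, journal text piped through go run), it would print one line per container of ovnkube-node-bkksr, all with grace=30s.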
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883076 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883118 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883143 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883166 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883184 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883248 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gwnt\" (UniqueName: \"kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883590 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883616 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883624 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883641 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883688 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883720 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883726 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883769 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.883808 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.884354 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885779 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885816 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885845 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885868 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885892 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.885904 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash\") pod \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\" (UID: \"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa\") " Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886245 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886257 4752 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886269 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886278 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886286 4752 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886297 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886373 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886406 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886449 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886470 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886486 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log" (OuterVolumeSpecName: "node-log") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.886725 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.887508 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.887889 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.887943 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.892030 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.895699 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.899993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt" (OuterVolumeSpecName: "kube-api-access-6gwnt") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). 
InnerVolumeSpecName "kube-api-access-6gwnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.913971 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzt4z"] Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.914395 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-acl-logging" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.914478 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-acl-logging" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.914549 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.914615 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.914695 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="northd" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.914788 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="northd" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.914935 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="sbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915045 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="sbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915151 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kubecfg-setup" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915295 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kubecfg-setup" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915379 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915451 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915548 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.914845 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" (UID: "fa360dfd-2d4c-4442-84c9-af5d97c4c1fa"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915617 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915697 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915714 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915727 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="nbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915733 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="nbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915762 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915770 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915781 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-node" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915787 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-node" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915799 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915805 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: E1124 11:17:56.915813 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915819 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.915990 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-acl-logging" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916001 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="northd" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916008 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-node" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916018 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovn-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916027 4752 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916034 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916041 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="nbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916050 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916058 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916064 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="sbdb" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916072 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.916502 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerName="ovnkube-controller" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.920194 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.987459 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-log-socket\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.987712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-kubelet\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.987806 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-bin\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.987879 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.987998 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-script-lib\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988049 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-netd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988149 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988201 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-systemd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988255 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-ovn\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988279 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988300 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66mwk\" (UniqueName: \"kubernetes.io/projected/9f51a639-632f-4fd9-bb34-a68d0154ee15-kube-api-access-66mwk\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988315 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovn-node-metrics-cert\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988331 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-etc-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: 
I1124 11:17:56.988346 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-netns\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988465 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-config\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988499 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-var-lib-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-systemd-units\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988659 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-node-log\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988678 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-slash\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988800 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-env-overrides\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988878 4752 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988889 4752 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-node-log\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988899 4752 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-log-socket\") on node \"crc\" DevicePath \"\"" Nov 24 
11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988908 4752 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988917 4752 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988926 4752 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-slash\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988935 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988945 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gwnt\" (UniqueName: \"kubernetes.io/projected/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-kube-api-access-6gwnt\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988954 4752 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988962 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988971 4752 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988980 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988990 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:56 crc kubenswrapper[4752]: I1124 11:17:56.988998 4752 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090177 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090253 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-systemd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090317 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-ovn\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090366 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090400 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090417 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-systemd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090409 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66mwk\" (UniqueName: \"kubernetes.io/projected/9f51a639-632f-4fd9-bb34-a68d0154ee15-kube-api-access-66mwk\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090481 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-ovn\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090482 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovn-node-metrics-cert\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090535 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-netns\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090573 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-etc-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-config\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090633 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-var-lib-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090695 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-systemd-units\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090730 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-etc-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090779 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-netns\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090777 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-node-log\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090809 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-slash\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-node-log\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090840 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-env-overrides\") pod \"ovnkube-node-pzt4z\" (UID: 
\"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090898 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-kubelet\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090966 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-log-socket\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.090998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-bin\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091029 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091031 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091098 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-slash\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091095 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-var-lib-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091151 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-log-socket\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091206 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-kubelet\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 
11:17:57.091209 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-systemd-units\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-script-lib\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091253 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-bin\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-netd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-run-openvswitch\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091434 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f51a639-632f-4fd9-bb34-a68d0154ee15-host-cni-netd\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.091983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-config\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.092273 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-env-overrides\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.092280 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovnkube-script-lib\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.094922 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9f51a639-632f-4fd9-bb34-a68d0154ee15-ovn-node-metrics-cert\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.108317 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66mwk\" (UniqueName: \"kubernetes.io/projected/9f51a639-632f-4fd9-bb34-a68d0154ee15-kube-api-access-66mwk\") pod \"ovnkube-node-pzt4z\" (UID: \"9f51a639-632f-4fd9-bb34-a68d0154ee15\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.236650 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:17:57 crc kubenswrapper[4752]: W1124 11:17:57.266281 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f51a639_632f_4fd9_bb34_a68d0154ee15.slice/crio-07f07f659d6beaf3b1c4a3e58a7dee4565cc8a1488c2328acdfb8f3e9d3e9dde WatchSource:0}: Error finding container 07f07f659d6beaf3b1c4a3e58a7dee4565cc8a1488c2328acdfb8f3e9d3e9dde: Status 404 returned error can't find the container with id 07f07f659d6beaf3b1c4a3e58a7dee4565cc8a1488c2328acdfb8f3e9d3e9dde Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.554579 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovnkube-controller/3.log" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.559687 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovn-acl-logging/0.log" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.560431 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bkksr_fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/ovn-controller/0.log" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.560938 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.560982 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.560992 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561002 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561011 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561020 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" 
exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561031 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" exitCode=143 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561040 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" exitCode=143 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561137 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561154 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561168 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561181 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561210 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561223 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561232 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561239 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561245 4752 pod_container_deletor.go:114] "Failed to issue the request to 
remove container" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561252 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561260 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561267 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561264 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561274 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561381 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561397 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561405 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561413 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561420 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561429 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561437 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561443 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561451 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561458 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561465 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561474 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561485 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561494 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561501 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561508 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561515 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561522 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561529 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561537 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561544 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561220 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561551 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561732 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bkksr" event={"ID":"fa360dfd-2d4c-4442-84c9-af5d97c4c1fa","Type":"ContainerDied","Data":"c35a8ef86d4ab8140f3da0310e5ef06887a7ea52a57a2f5d5b14e725dffcd022"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561819 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561841 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561853 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561865 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561879 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561895 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561913 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561927 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561941 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.561956 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.563876 4752 generic.go:334] "Generic (PLEG): container finished" podID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerID="5ee4237923e9c6016cad515e8498f1afbe0c24add17ccd993fe7ecc0473ed2b5" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.563965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerDied","Data":"5ee4237923e9c6016cad515e8498f1afbe0c24add17ccd993fe7ecc0473ed2b5"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.566008 4752 generic.go:334] "Generic (PLEG): container finished" podID="9f51a639-632f-4fd9-bb34-a68d0154ee15" containerID="593ab8ddccbb7962c7197deef9683148f711273d1064314ada034de0581f2d5a" exitCode=0 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.566049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerDied","Data":"593ab8ddccbb7962c7197deef9683148f711273d1064314ada034de0581f2d5a"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.566140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"07f07f659d6beaf3b1c4a3e58a7dee4565cc8a1488c2328acdfb8f3e9d3e9dde"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.577527 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/2.log" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.580675 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/1.log" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.580720 4752 generic.go:334] "Generic (PLEG): container finished" podID="f578963d-5ff1-4e31-945b-cc59f0b244bf" containerID="657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0" exitCode=2 Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.580771 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerDied","Data":"657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.580796 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9"} Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.582134 4752 scope.go:117] "RemoveContainer" containerID="657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.582346 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jh899_openshift-multus(f578963d-5ff1-4e31-945b-cc59f0b244bf)\"" pod="openshift-multus/multus-jh899" podUID="f578963d-5ff1-4e31-945b-cc59f0b244bf" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.609405 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.627416 4752 scope.go:117] "RemoveContainer" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.673322 4752 scope.go:117] "RemoveContainer" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.675959 
4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bkksr"] Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.679237 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bkksr"] Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.716489 4752 scope.go:117] "RemoveContainer" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.744842 4752 scope.go:117] "RemoveContainer" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.762043 4752 scope.go:117] "RemoveContainer" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.778258 4752 scope.go:117] "RemoveContainer" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.790801 4752 scope.go:117] "RemoveContainer" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.810688 4752 scope.go:117] "RemoveContainer" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.832131 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.832655 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.832691 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} err="failed to get container status \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.832732 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.833126 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": container with ID starting with 566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af not found: ID does not exist" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.833187 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} err="failed to get container status 
\"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": rpc error: code = NotFound desc = could not find container \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": container with ID starting with 566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.833230 4752 scope.go:117] "RemoveContainer" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.833575 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": container with ID starting with b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5 not found: ID does not exist" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.833607 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} err="failed to get container status \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": rpc error: code = NotFound desc = could not find container \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": container with ID starting with b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.833627 4752 scope.go:117] "RemoveContainer" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.834005 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": container with ID starting with a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c not found: ID does not exist" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.834036 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} err="failed to get container status \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": rpc error: code = NotFound desc = could not find container \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": container with ID starting with a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.834054 4752 scope.go:117] "RemoveContainer" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.834731 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": container with ID starting with c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad not found: ID does not exist" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.834775 4752 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} err="failed to get container status \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": rpc error: code = NotFound desc = could not find container \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": container with ID starting with c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.834793 4752 scope.go:117] "RemoveContainer" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.834983 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": container with ID starting with 7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24 not found: ID does not exist" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835004 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} err="failed to get container status \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": rpc error: code = NotFound desc = could not find container \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": container with ID starting with 7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835033 4752 scope.go:117] "RemoveContainer" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.835526 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": container with ID starting with 8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab not found: ID does not exist" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835553 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} err="failed to get container status \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": rpc error: code = NotFound desc = could not find container \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": container with ID starting with 8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835573 4752 scope.go:117] "RemoveContainer" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.835794 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": container with ID starting with 5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989 not found: ID does not exist" 
containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835840 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} err="failed to get container status \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": rpc error: code = NotFound desc = could not find container \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": container with ID starting with 5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.835859 4752 scope.go:117] "RemoveContainer" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.836219 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": container with ID starting with d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255 not found: ID does not exist" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836245 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} err="failed to get container status \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": rpc error: code = NotFound desc = could not find container \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": container with ID starting with d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836261 4752 scope.go:117] "RemoveContainer" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: E1124 11:17:57.836508 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": container with ID starting with fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac not found: ID does not exist" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836533 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} err="failed to get container status \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": rpc error: code = NotFound desc = could not find container \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": container with ID starting with fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836553 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836797 4752 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} err="failed to get container status \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.836818 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837283 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} err="failed to get container status \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": rpc error: code = NotFound desc = could not find container \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": container with ID starting with 566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837306 4752 scope.go:117] "RemoveContainer" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837534 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} err="failed to get container status \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": rpc error: code = NotFound desc = could not find container \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": container with ID starting with b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837553 4752 scope.go:117] "RemoveContainer" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837767 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} err="failed to get container status \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": rpc error: code = NotFound desc = could not find container \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": container with ID starting with a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.837787 4752 scope.go:117] "RemoveContainer" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838015 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} err="failed to get container status \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": rpc error: code = NotFound desc = could not find container \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": container with ID starting with c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad not found: ID does not exist" Nov 
24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838043 4752 scope.go:117] "RemoveContainer" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838256 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} err="failed to get container status \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": rpc error: code = NotFound desc = could not find container \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": container with ID starting with 7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838281 4752 scope.go:117] "RemoveContainer" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838511 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} err="failed to get container status \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": rpc error: code = NotFound desc = could not find container \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": container with ID starting with 8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838534 4752 scope.go:117] "RemoveContainer" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838808 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} err="failed to get container status \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": rpc error: code = NotFound desc = could not find container \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": container with ID starting with 5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.838852 4752 scope.go:117] "RemoveContainer" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839153 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} err="failed to get container status \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": rpc error: code = NotFound desc = could not find container \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": container with ID starting with d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839180 4752 scope.go:117] "RemoveContainer" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839598 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} err="failed to get container status 
\"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": rpc error: code = NotFound desc = could not find container \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": container with ID starting with fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839644 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839938 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} err="failed to get container status \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.839960 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840218 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} err="failed to get container status \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": rpc error: code = NotFound desc = could not find container \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": container with ID starting with 566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840242 4752 scope.go:117] "RemoveContainer" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840461 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} err="failed to get container status \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": rpc error: code = NotFound desc = could not find container \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": container with ID starting with b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840487 4752 scope.go:117] "RemoveContainer" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840725 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} err="failed to get container status \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": rpc error: code = NotFound desc = could not find container \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": container with ID starting with a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.840764 4752 scope.go:117] "RemoveContainer" 
containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841055 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} err="failed to get container status \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": rpc error: code = NotFound desc = could not find container \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": container with ID starting with c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841083 4752 scope.go:117] "RemoveContainer" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841381 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} err="failed to get container status \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": rpc error: code = NotFound desc = could not find container \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": container with ID starting with 7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841426 4752 scope.go:117] "RemoveContainer" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841634 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} err="failed to get container status \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": rpc error: code = NotFound desc = could not find container \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": container with ID starting with 8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841655 4752 scope.go:117] "RemoveContainer" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841896 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} err="failed to get container status \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": rpc error: code = NotFound desc = could not find container \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": container with ID starting with 5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.841944 4752 scope.go:117] "RemoveContainer" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842218 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} err="failed to get container status \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": rpc error: code = NotFound desc = could not find 
container \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": container with ID starting with d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842243 4752 scope.go:117] "RemoveContainer" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842444 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} err="failed to get container status \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": rpc error: code = NotFound desc = could not find container \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": container with ID starting with fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842500 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842799 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} err="failed to get container status \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.842818 4752 scope.go:117] "RemoveContainer" containerID="566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843088 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af"} err="failed to get container status \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": rpc error: code = NotFound desc = could not find container \"566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af\": container with ID starting with 566c30d9b080f4e2f8063a4688af064d4742e776756e6527a6e9811f434e80af not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843106 4752 scope.go:117] "RemoveContainer" containerID="b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843332 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5"} err="failed to get container status \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": rpc error: code = NotFound desc = could not find container \"b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5\": container with ID starting with b49ae1d5912e81b7a330beb7255c696e62cbe639c13e9cad38bce2eab13946f5 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843358 4752 scope.go:117] "RemoveContainer" containerID="a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843649 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c"} err="failed to get container status \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": rpc error: code = NotFound desc = could not find container \"a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c\": container with ID starting with a0c7109f69e3cdd604573c7fdb7f8ad7f4f829a6f96cbfea97ce964f7ddd338c not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843673 4752 scope.go:117] "RemoveContainer" containerID="c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843889 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad"} err="failed to get container status \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": rpc error: code = NotFound desc = could not find container \"c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad\": container with ID starting with c7b05e513f6dccaac1291f9a16a8a6dbe1e8b60bbf0c653ae767f276265c05ad not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.843908 4752 scope.go:117] "RemoveContainer" containerID="7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844108 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24"} err="failed to get container status \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": rpc error: code = NotFound desc = could not find container \"7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24\": container with ID starting with 7ab54b4ad93ea8bff28c06a97335efd84339378f695dbf80e563d840e1697f24 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844130 4752 scope.go:117] "RemoveContainer" containerID="8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844310 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab"} err="failed to get container status \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": rpc error: code = NotFound desc = could not find container \"8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab\": container with ID starting with 8cc599b4502840adc968a6b4f9a7c54886d8befefa791160dbfabeb02ca707ab not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844334 4752 scope.go:117] "RemoveContainer" containerID="5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844535 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989"} err="failed to get container status \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": rpc error: code = NotFound desc = could not find container \"5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989\": container with ID starting with 
5218d7ab01da050160a679838f1b3122de1ed45783435a4c2ad798e9ab534989 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844553 4752 scope.go:117] "RemoveContainer" containerID="d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844768 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255"} err="failed to get container status \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": rpc error: code = NotFound desc = could not find container \"d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255\": container with ID starting with d162cffb20785d53506a2726c10a2d8d6b46960719a523ca6745da28ce288255 not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.844793 4752 scope.go:117] "RemoveContainer" containerID="fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.845042 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac"} err="failed to get container status \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": rpc error: code = NotFound desc = could not find container \"fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac\": container with ID starting with fd9c167fd977496010cf2c891c67f166892a89184b389a88e5b686465736c9ac not found: ID does not exist" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.845061 4752 scope.go:117] "RemoveContainer" containerID="391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448" Nov 24 11:17:57 crc kubenswrapper[4752]: I1124 11:17:57.845236 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448"} err="failed to get container status \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": rpc error: code = NotFound desc = could not find container \"391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448\": container with ID starting with 391be3743a920a4488b7847c25adb755ba6f2fbce288cd8e210d51dee7fe0448 not found: ID does not exist" Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.597983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"dc552b99bbf73a8f7db18c490cba16198b2364a2cbc342f95c24496f1df70e99"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.598070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"519048d00b20cbe1da934d49c0eda82a13a6bac5662c21eadcdc8e6e9f263ffd"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.598093 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"27fda401fe032aabc574514661e008207f0ecfa7d45987dfe2627c4905eb0c74"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.598106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" 
event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"15c119207f770c7d0d22343522846989790eeded0b92224df4b92afac53d531e"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.598119 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"6347fa196fca39972230e46b0eeab53d4b7b32230b2c7977045366812a2f0eb4"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.598132 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"89639b2030df86d83cc92890d503e6cece0a2f1240aab53d64f6d6f803d37881"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.602651 4752 generic.go:334] "Generic (PLEG): container finished" podID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerID="26566f94848ec24057f0700a6578c94e8f29ff164f0e59e6972d0b5260bff7e2" exitCode=0 Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.602717 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerDied","Data":"26566f94848ec24057f0700a6578c94e8f29ff164f0e59e6972d0b5260bff7e2"} Nov 24 11:17:58 crc kubenswrapper[4752]: I1124 11:17:58.736316 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa360dfd-2d4c-4442-84c9-af5d97c4c1fa" path="/var/lib/kubelet/pods/fa360dfd-2d4c-4442-84c9-af5d97c4c1fa/volumes" Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.763309 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.925423 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpzfw\" (UniqueName: \"kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw\") pod \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.925492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util\") pod \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.925641 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle\") pod \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\" (UID: \"c193f46d-1b6e-4de5-a7e6-aac42bba0e53\") " Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.926455 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle" (OuterVolumeSpecName: "bundle") pod "c193f46d-1b6e-4de5-a7e6-aac42bba0e53" (UID: "c193f46d-1b6e-4de5-a7e6-aac42bba0e53"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:17:59 crc kubenswrapper[4752]: I1124 11:17:59.934437 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw" (OuterVolumeSpecName: "kube-api-access-bpzfw") pod "c193f46d-1b6e-4de5-a7e6-aac42bba0e53" (UID: "c193f46d-1b6e-4de5-a7e6-aac42bba0e53"). InnerVolumeSpecName "kube-api-access-bpzfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.027069 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpzfw\" (UniqueName: \"kubernetes.io/projected/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-kube-api-access-bpzfw\") on node \"crc\" DevicePath \"\"" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.027125 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.183481 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util" (OuterVolumeSpecName: "util") pod "c193f46d-1b6e-4de5-a7e6-aac42bba0e53" (UID: "c193f46d-1b6e-4de5-a7e6-aac42bba0e53"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.230017 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c193f46d-1b6e-4de5-a7e6-aac42bba0e53-util\") on node \"crc\" DevicePath \"\"" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.619737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" event={"ID":"c193f46d-1b6e-4de5-a7e6-aac42bba0e53","Type":"ContainerDied","Data":"a39b1bf1d8cb8f3a552d80a1fee9c847ba41d5ef53de1897b9eb777e048bee59"} Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.619806 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39b1bf1d8cb8f3a552d80a1fee9c847ba41d5ef53de1897b9eb777e048bee59" Nov 24 11:18:00 crc kubenswrapper[4752]: I1124 11:18:00.619823 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf" Nov 24 11:18:01 crc kubenswrapper[4752]: I1124 11:18:01.633001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"99221cfe23953091b8f3c10def639d1d20a11f13e2e11bedd50e56093e308fc8"} Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.195565 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-jrd2k"] Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.195792 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="extract" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.195806 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="extract" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.195819 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="util" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.195827 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="util" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.195836 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="pull" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.195844 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="pull" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.195933 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c193f46d-1b6e-4de5-a7e6-aac42bba0e53" containerName="extract" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.196273 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.198054 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.198175 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.199192 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-24s95" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.358716 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcjxw\" (UniqueName: \"kubernetes.io/projected/af6619a5-cfae-4ca3-99ff-dc2f716fee60-kube-api-access-qcjxw\") pod \"nmstate-operator-557fdffb88-jrd2k\" (UID: \"af6619a5-cfae-4ca3-99ff-dc2f716fee60\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.459595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjxw\" (UniqueName: \"kubernetes.io/projected/af6619a5-cfae-4ca3-99ff-dc2f716fee60-kube-api-access-qcjxw\") pod \"nmstate-operator-557fdffb88-jrd2k\" (UID: \"af6619a5-cfae-4ca3-99ff-dc2f716fee60\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.479164 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjxw\" (UniqueName: \"kubernetes.io/projected/af6619a5-cfae-4ca3-99ff-dc2f716fee60-kube-api-access-qcjxw\") pod \"nmstate-operator-557fdffb88-jrd2k\" (UID: \"af6619a5-cfae-4ca3-99ff-dc2f716fee60\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: I1124 11:18:02.510950 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.539206 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(d33a612867e14b69f80c398506b235c9229bb18e02935be65a611dad85e7756d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.539410 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(d33a612867e14b69f80c398506b235c9229bb18e02935be65a611dad85e7756d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.539574 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(d33a612867e14b69f80c398506b235c9229bb18e02935be65a611dad85e7756d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:02 crc kubenswrapper[4752]: E1124 11:18:02.539778 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(d33a612867e14b69f80c398506b235c9229bb18e02935be65a611dad85e7756d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" podUID="af6619a5-cfae-4ca3-99ff-dc2f716fee60" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.645321 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-jrd2k"] Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.646045 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.646471 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.649024 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" event={"ID":"9f51a639-632f-4fd9-bb34-a68d0154ee15","Type":"ContainerStarted","Data":"d2a26e8fca1fa323e6037be27886b688b0dca4712835ab02137dc5e682b257a0"} Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.650187 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.650227 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.650310 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.705018 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.706137 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" podStartSLOduration=7.706112107 podStartE2EDuration="7.706112107s" podCreationTimestamp="2025-11-24 11:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:18:03.691399912 +0000 UTC m=+689.676220211" watchObservedRunningTime="2025-11-24 11:18:03.706112107 +0000 UTC m=+689.690932396" Nov 24 11:18:03 crc kubenswrapper[4752]: E1124 11:18:03.713110 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(44c822e5f250fb7c817cb71aeb5f4d4d74b703de31c6db4ab2d5be2de2571663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 24 11:18:03 crc kubenswrapper[4752]: E1124 11:18:03.713201 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(44c822e5f250fb7c817cb71aeb5f4d4d74b703de31c6db4ab2d5be2de2571663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:03 crc kubenswrapper[4752]: E1124 11:18:03.713222 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(44c822e5f250fb7c817cb71aeb5f4d4d74b703de31c6db4ab2d5be2de2571663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:03 crc kubenswrapper[4752]: E1124 11:18:03.713296 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(44c822e5f250fb7c817cb71aeb5f4d4d74b703de31c6db4ab2d5be2de2571663): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" podUID="af6619a5-cfae-4ca3-99ff-dc2f716fee60" Nov 24 11:18:03 crc kubenswrapper[4752]: I1124 11:18:03.721773 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:08 crc kubenswrapper[4752]: I1124 11:18:08.729098 4752 scope.go:117] "RemoveContainer" containerID="657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0" Nov 24 11:18:08 crc kubenswrapper[4752]: E1124 11:18:08.730349 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jh899_openshift-multus(f578963d-5ff1-4e31-945b-cc59f0b244bf)\"" pod="openshift-multus/multus-jh899" podUID="f578963d-5ff1-4e31-945b-cc59f0b244bf" Nov 24 11:18:15 crc kubenswrapper[4752]: I1124 11:18:15.469140 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:18:15 crc kubenswrapper[4752]: I1124 11:18:15.469548 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:18:15 crc kubenswrapper[4752]: I1124 11:18:15.727440 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:15 crc kubenswrapper[4752]: I1124 11:18:15.728140 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:15 crc kubenswrapper[4752]: E1124 11:18:15.766944 4752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(797c2e53abdfa6bd33edcc978123204b351f92fe07d5b6f475334cd39d1b07c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 11:18:15 crc kubenswrapper[4752]: E1124 11:18:15.767401 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(797c2e53abdfa6bd33edcc978123204b351f92fe07d5b6f475334cd39d1b07c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:15 crc kubenswrapper[4752]: E1124 11:18:15.767439 4752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(797c2e53abdfa6bd33edcc978123204b351f92fe07d5b6f475334cd39d1b07c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:15 crc kubenswrapper[4752]: E1124 11:18:15.767523 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-557fdffb88-jrd2k_openshift-nmstate(af6619a5-cfae-4ca3-99ff-dc2f716fee60)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-557fdffb88-jrd2k_openshift-nmstate_af6619a5-cfae-4ca3-99ff-dc2f716fee60_0(797c2e53abdfa6bd33edcc978123204b351f92fe07d5b6f475334cd39d1b07c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" podUID="af6619a5-cfae-4ca3-99ff-dc2f716fee60" Nov 24 11:18:21 crc kubenswrapper[4752]: I1124 11:18:21.728160 4752 scope.go:117] "RemoveContainer" containerID="657984dabefbfedd0918034d9477ebc04272e1cdc385401ce0e3c550ce68faf0" Nov 24 11:18:22 crc kubenswrapper[4752]: I1124 11:18:22.779727 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/2.log" Nov 24 11:18:22 crc kubenswrapper[4752]: I1124 11:18:22.781134 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/1.log" Nov 24 11:18:22 crc kubenswrapper[4752]: I1124 11:18:22.781201 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jh899" event={"ID":"f578963d-5ff1-4e31-945b-cc59f0b244bf","Type":"ContainerStarted","Data":"efe443f4adee696a9e9fdaca48c6b75361e9229532edd73437a0d255aff3544e"} Nov 24 11:18:27 crc kubenswrapper[4752]: I1124 11:18:27.274806 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzt4z" Nov 24 11:18:28 crc kubenswrapper[4752]: I1124 11:18:28.727590 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:28 crc kubenswrapper[4752]: I1124 11:18:28.728270 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" Nov 24 11:18:29 crc kubenswrapper[4752]: I1124 11:18:29.010120 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-jrd2k"] Nov 24 11:18:29 crc kubenswrapper[4752]: W1124 11:18:29.023098 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf6619a5_cfae_4ca3_99ff_dc2f716fee60.slice/crio-4cf0f1bb0c8e4fdc80cb0dd488d1c74695aeb628b5191e03341dfa11336fb903 WatchSource:0}: Error finding container 4cf0f1bb0c8e4fdc80cb0dd488d1c74695aeb628b5191e03341dfa11336fb903: Status 404 returned error can't find the container with id 4cf0f1bb0c8e4fdc80cb0dd488d1c74695aeb628b5191e03341dfa11336fb903 Nov 24 11:18:29 crc kubenswrapper[4752]: I1124 11:18:29.832668 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" event={"ID":"af6619a5-cfae-4ca3-99ff-dc2f716fee60","Type":"ContainerStarted","Data":"4cf0f1bb0c8e4fdc80cb0dd488d1c74695aeb628b5191e03341dfa11336fb903"} Nov 24 11:18:31 crc kubenswrapper[4752]: I1124 11:18:31.847783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" event={"ID":"af6619a5-cfae-4ca3-99ff-dc2f716fee60","Type":"ContainerStarted","Data":"82e4a87cb9f80630fb5e224466e847d93da0e725c97c639d94e3091873299759"} Nov 24 11:18:31 crc kubenswrapper[4752]: I1124 11:18:31.876527 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-jrd2k" podStartSLOduration=27.76988412 podStartE2EDuration="29.876494441s" podCreationTimestamp="2025-11-24 11:18:02 +0000 UTC" firstStartedPulling="2025-11-24 11:18:29.025885363 +0000 UTC m=+715.010705682" lastFinishedPulling="2025-11-24 11:18:31.132495714 +0000 UTC m=+717.117316003" observedRunningTime="2025-11-24 11:18:31.873420782 +0000 UTC 
m=+717.858241171" watchObservedRunningTime="2025-11-24 11:18:31.876494441 +0000 UTC m=+717.861314740" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.896874 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs"] Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.898224 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.901819 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-6m74v" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.912132 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs"] Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.913976 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.915532 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs"] Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.916250 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.933332 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lbjrx"] Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.934074 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.968112 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs"] Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2f9m\" (UniqueName: \"kubernetes.io/projected/30debda6-1a90-4bb5-8b86-46344bc95a1e-kube-api-access-m2f9m\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzr9\" (UniqueName: \"kubernetes.io/projected/469d4bbf-3549-4f0f-8abe-574354176c0e-kube-api-access-lwzr9\") pod \"nmstate-metrics-5dcf9c57c5-fvwhs\" (UID: \"469d4bbf-3549-4f0f-8abe-574354176c0e\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972769 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-ovs-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " 
pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972885 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-nmstate-lock\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.972937 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trqr\" (UniqueName: \"kubernetes.io/projected/776b04ad-d48c-428f-8484-63bd82cde2a2-kube-api-access-8trqr\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:32 crc kubenswrapper[4752]: I1124 11:18:32.973007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-dbus-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.050392 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.051241 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.054630 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lh2vj" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.054827 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.054946 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.062796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074159 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-nmstate-lock\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074204 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trqr\" (UniqueName: \"kubernetes.io/projected/776b04ad-d48c-428f-8484-63bd82cde2a2-kube-api-access-8trqr\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-dbus-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074255 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2f9m\" (UniqueName: \"kubernetes.io/projected/30debda6-1a90-4bb5-8b86-46344bc95a1e-kube-api-access-m2f9m\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074272 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzr9\" (UniqueName: \"kubernetes.io/projected/469d4bbf-3549-4f0f-8abe-574354176c0e-kube-api-access-lwzr9\") pod \"nmstate-metrics-5dcf9c57c5-fvwhs\" (UID: \"469d4bbf-3549-4f0f-8abe-574354176c0e\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074290 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074317 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-ovs-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-ovs-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.074648 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-nmstate-lock\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.075120 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/776b04ad-d48c-428f-8484-63bd82cde2a2-dbus-socket\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: E1124 11:18:33.075382 4752 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 24 11:18:33 crc kubenswrapper[4752]: E1124 11:18:33.075433 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair podName:30debda6-1a90-4bb5-8b86-46344bc95a1e nodeName:}" failed. No retries permitted until 2025-11-24 11:18:33.575418963 +0000 UTC m=+719.560239252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair") pod "nmstate-webhook-6b89b748d8-r8bcs" (UID: "30debda6-1a90-4bb5-8b86-46344bc95a1e") : secret "openshift-nmstate-webhook" not found Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.091187 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2f9m\" (UniqueName: \"kubernetes.io/projected/30debda6-1a90-4bb5-8b86-46344bc95a1e-kube-api-access-m2f9m\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.091620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trqr\" (UniqueName: \"kubernetes.io/projected/776b04ad-d48c-428f-8484-63bd82cde2a2-kube-api-access-8trqr\") pod \"nmstate-handler-lbjrx\" (UID: \"776b04ad-d48c-428f-8484-63bd82cde2a2\") " pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.092046 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzr9\" (UniqueName: \"kubernetes.io/projected/469d4bbf-3549-4f0f-8abe-574354176c0e-kube-api-access-lwzr9\") pod \"nmstate-metrics-5dcf9c57c5-fvwhs\" (UID: \"469d4bbf-3549-4f0f-8abe-574354176c0e\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.175673 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.175717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqsdk\" (UniqueName: \"kubernetes.io/projected/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-kube-api-access-hqsdk\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.175735 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.215718 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-684b966679-pc9mm"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.216552 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.225175 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-684b966679-pc9mm"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.259067 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-service-ca\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277319 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-trusted-ca-bundle\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277362 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrwb\" (UniqueName: \"kubernetes.io/projected/2529441c-dd28-4a24-9afa-6caf91bc2d16-kube-api-access-gmrwb\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277402 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqsdk\" (UniqueName: \"kubernetes.io/projected/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-kube-api-access-hqsdk\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277453 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277478 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277539 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-oauth-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.277559 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-oauth-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.278666 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.282978 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.301096 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.303477 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqsdk\" (UniqueName: \"kubernetes.io/projected/645d9b0b-fd05-44fa-84ce-7fda6cd1c786-kube-api-access-hqsdk\") pod \"nmstate-console-plugin-5874bd7bc5-tlw8h\" (UID: \"645d9b0b-fd05-44fa-84ce-7fda6cd1c786\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: W1124 11:18:33.321297 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776b04ad_d48c_428f_8484_63bd82cde2a2.slice/crio-5a94b6228806a1dfbcde3c4e8b2cc298f5dc0e690c8b153db08a2c08bda26cb3 WatchSource:0}: Error finding container 5a94b6228806a1dfbcde3c4e8b2cc298f5dc0e690c8b153db08a2c08bda26cb3: Status 404 returned error can't find the container with id 5a94b6228806a1dfbcde3c4e8b2cc298f5dc0e690c8b153db08a2c08bda26cb3 Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.369647 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379441 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379555 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-service-ca\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379599 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-trusted-ca-bundle\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379640 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrwb\" (UniqueName: \"kubernetes.io/projected/2529441c-dd28-4a24-9afa-6caf91bc2d16-kube-api-access-gmrwb\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379759 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-oauth-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.379787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-oauth-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.380523 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.382087 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-service-ca\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " 
pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.382337 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-oauth-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.383478 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2529441c-dd28-4a24-9afa-6caf91bc2d16-trusted-ca-bundle\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.385976 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-oauth-config\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.387959 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2529441c-dd28-4a24-9afa-6caf91bc2d16-console-serving-cert\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.398951 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrwb\" (UniqueName: \"kubernetes.io/projected/2529441c-dd28-4a24-9afa-6caf91bc2d16-kube-api-access-gmrwb\") pod \"console-684b966679-pc9mm\" (UID: \"2529441c-dd28-4a24-9afa-6caf91bc2d16\") " pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.439862 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.530669 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.555793 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.581415 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.585396 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/30debda6-1a90-4bb5-8b86-46344bc95a1e-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-r8bcs\" (UID: \"30debda6-1a90-4bb5-8b86-46344bc95a1e\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.718331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-684b966679-pc9mm"] Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.861902 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" event={"ID":"645d9b0b-fd05-44fa-84ce-7fda6cd1c786","Type":"ContainerStarted","Data":"07e22763b8f26e416259449541bb2027b02d4f8ec9d87f7180b3361d06b3801f"} Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.863571 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lbjrx" event={"ID":"776b04ad-d48c-428f-8484-63bd82cde2a2","Type":"ContainerStarted","Data":"5a94b6228806a1dfbcde3c4e8b2cc298f5dc0e690c8b153db08a2c08bda26cb3"} Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.864931 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" event={"ID":"469d4bbf-3549-4f0f-8abe-574354176c0e","Type":"ContainerStarted","Data":"256bc44560f69bcf7187ac7672992a87cd623eec6cee5165fe05891f46eec176"} Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.866946 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684b966679-pc9mm" event={"ID":"2529441c-dd28-4a24-9afa-6caf91bc2d16","Type":"ContainerStarted","Data":"508afb2a26d6ba88fedea9eb44024105da8dce29cf6a835962ef252cc203051f"} Nov 24 11:18:33 crc kubenswrapper[4752]: I1124 11:18:33.874340 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:34 crc kubenswrapper[4752]: I1124 11:18:34.075869 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs"] Nov 24 11:18:34 crc kubenswrapper[4752]: W1124 11:18:34.085962 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30debda6_1a90_4bb5_8b86_46344bc95a1e.slice/crio-51136283e3a5d28097ccabf9007754256dd33c4cc85598f9172b1c5d0ade8b92 WatchSource:0}: Error finding container 51136283e3a5d28097ccabf9007754256dd33c4cc85598f9172b1c5d0ade8b92: Status 404 returned error can't find the container with id 51136283e3a5d28097ccabf9007754256dd33c4cc85598f9172b1c5d0ade8b92 Nov 24 11:18:34 crc kubenswrapper[4752]: I1124 11:18:34.874385 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" event={"ID":"30debda6-1a90-4bb5-8b86-46344bc95a1e","Type":"ContainerStarted","Data":"51136283e3a5d28097ccabf9007754256dd33c4cc85598f9172b1c5d0ade8b92"} Nov 24 11:18:34 crc kubenswrapper[4752]: I1124 11:18:34.876488 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-684b966679-pc9mm" event={"ID":"2529441c-dd28-4a24-9afa-6caf91bc2d16","Type":"ContainerStarted","Data":"16915f7b2ec3c0ccb7bec7364b5ade493b4f9d1ec47facbd21a26957c51e1137"} Nov 24 11:18:34 crc kubenswrapper[4752]: I1124 11:18:34.897918 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-684b966679-pc9mm" podStartSLOduration=1.8978962529999999 podStartE2EDuration="1.897896253s" podCreationTimestamp="2025-11-24 11:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:18:34.892546988 +0000 UTC m=+720.877367287" watchObservedRunningTime="2025-11-24 11:18:34.897896253 +0000 UTC m=+720.882716542" Nov 24 11:18:34 crc kubenswrapper[4752]: I1124 11:18:34.932971 4752 scope.go:117] "RemoveContainer" containerID="f51d87e3b03bc979ed61278f3c0e2021000517c2ff53f53793ac2862928974f9" Nov 24 11:18:35 crc kubenswrapper[4752]: I1124 11:18:35.885428 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jh899_f578963d-5ff1-4e31-945b-cc59f0b244bf/kube-multus/2.log" Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.897346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" event={"ID":"645d9b0b-fd05-44fa-84ce-7fda6cd1c786","Type":"ContainerStarted","Data":"ae78b214e7ff17864c0e27474e9fba0727cd5c8dc14c35207163b80057f44dbd"} Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.900377 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lbjrx" event={"ID":"776b04ad-d48c-428f-8484-63bd82cde2a2","Type":"ContainerStarted","Data":"7cb98ce559583ee6d63fd2097f9f6e8f7473e5122670c5ff8a7c05646a5cce86"} Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.900448 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.902260 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" 
event={"ID":"469d4bbf-3549-4f0f-8abe-574354176c0e","Type":"ContainerStarted","Data":"8620be50cce5b6150f7af8734bb4117c60f4c02ab9fc33025fcac32f26703b9c"} Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.903768 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" event={"ID":"30debda6-1a90-4bb5-8b86-46344bc95a1e","Type":"ContainerStarted","Data":"c39070eaa1e52167c48d9ea3576fae46955ff061185e323e814e1288e74b33f3"} Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.903891 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.914460 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tlw8h" podStartSLOduration=1.106610627 podStartE2EDuration="3.914443194s" podCreationTimestamp="2025-11-24 11:18:33 +0000 UTC" firstStartedPulling="2025-11-24 11:18:33.5608082 +0000 UTC m=+719.545628489" lastFinishedPulling="2025-11-24 11:18:36.368640767 +0000 UTC m=+722.353461056" observedRunningTime="2025-11-24 11:18:36.912772985 +0000 UTC m=+722.897593284" watchObservedRunningTime="2025-11-24 11:18:36.914443194 +0000 UTC m=+722.899263483" Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.954014 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lbjrx" podStartSLOduration=1.914630228 podStartE2EDuration="4.953996209s" podCreationTimestamp="2025-11-24 11:18:32 +0000 UTC" firstStartedPulling="2025-11-24 11:18:33.328365269 +0000 UTC m=+719.313185568" lastFinishedPulling="2025-11-24 11:18:36.36773126 +0000 UTC m=+722.352551549" observedRunningTime="2025-11-24 11:18:36.934465814 +0000 UTC m=+722.919286103" watchObservedRunningTime="2025-11-24 11:18:36.953996209 +0000 UTC m=+722.938816508" Nov 24 11:18:36 crc kubenswrapper[4752]: I1124 11:18:36.955833 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" podStartSLOduration=2.674901514 podStartE2EDuration="4.955824132s" podCreationTimestamp="2025-11-24 11:18:32 +0000 UTC" firstStartedPulling="2025-11-24 11:18:34.089091899 +0000 UTC m=+720.073912208" lastFinishedPulling="2025-11-24 11:18:36.370014537 +0000 UTC m=+722.354834826" observedRunningTime="2025-11-24 11:18:36.953129434 +0000 UTC m=+722.937949733" watchObservedRunningTime="2025-11-24 11:18:36.955824132 +0000 UTC m=+722.940644421" Nov 24 11:18:38 crc kubenswrapper[4752]: I1124 11:18:38.922473 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" event={"ID":"469d4bbf-3549-4f0f-8abe-574354176c0e","Type":"ContainerStarted","Data":"f43cea48559749c4b74b3a3474ff339edad4b58de83ff52b6b0c944db8ba7930"} Nov 24 11:18:38 crc kubenswrapper[4752]: I1124 11:18:38.954286 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-fvwhs" podStartSLOduration=1.7871069450000001 podStartE2EDuration="6.954250838s" podCreationTimestamp="2025-11-24 11:18:32 +0000 UTC" firstStartedPulling="2025-11-24 11:18:33.453425911 +0000 UTC m=+719.438246200" lastFinishedPulling="2025-11-24 11:18:38.620569784 +0000 UTC m=+724.605390093" observedRunningTime="2025-11-24 11:18:38.950126858 +0000 UTC m=+724.934947227" watchObservedRunningTime="2025-11-24 11:18:38.954250838 +0000 UTC m=+724.939071157" Nov 24 11:18:43 crc 
kubenswrapper[4752]: I1124 11:18:43.345120 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lbjrx" Nov 24 11:18:43 crc kubenswrapper[4752]: I1124 11:18:43.531346 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:43 crc kubenswrapper[4752]: I1124 11:18:43.531596 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:43 crc kubenswrapper[4752]: I1124 11:18:43.538706 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:43 crc kubenswrapper[4752]: I1124 11:18:43.970052 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-684b966679-pc9mm" Nov 24 11:18:44 crc kubenswrapper[4752]: I1124 11:18:44.065192 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:18:45 crc kubenswrapper[4752]: I1124 11:18:45.468700 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:18:45 crc kubenswrapper[4752]: I1124 11:18:45.469016 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:18:53 crc kubenswrapper[4752]: I1124 11:18:53.884532 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-r8bcs" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.327297 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.328441 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerName="controller-manager" containerID="cri-o://47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4" gracePeriod=30 Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.428346 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.429119 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerName="route-controller-manager" containerID="cri-o://1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711" gracePeriod=30 Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.751564 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.821823 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert\") pod \"d4bbc9d0-4420-4581-a3be-919da131bf9a\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.827984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca\") pod \"d4bbc9d0-4420-4581-a3be-919da131bf9a\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.828045 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config\") pod \"d4bbc9d0-4420-4581-a3be-919da131bf9a\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.830660 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config" (OuterVolumeSpecName: "config") pod "d4bbc9d0-4420-4581-a3be-919da131bf9a" (UID: "d4bbc9d0-4420-4581-a3be-919da131bf9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.830815 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tmj4\" (UniqueName: \"kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4\") pod \"d4bbc9d0-4420-4581-a3be-919da131bf9a\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.830868 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles\") pod \"d4bbc9d0-4420-4581-a3be-919da131bf9a\" (UID: \"d4bbc9d0-4420-4581-a3be-919da131bf9a\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.831444 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.833141 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4bbc9d0-4420-4581-a3be-919da131bf9a" (UID: "d4bbc9d0-4420-4581-a3be-919da131bf9a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.837492 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d4bbc9d0-4420-4581-a3be-919da131bf9a" (UID: "d4bbc9d0-4420-4581-a3be-919da131bf9a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.841006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4bbc9d0-4420-4581-a3be-919da131bf9a" (UID: "d4bbc9d0-4420-4581-a3be-919da131bf9a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.842944 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.845945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4" (OuterVolumeSpecName: "kube-api-access-9tmj4") pod "d4bbc9d0-4420-4581-a3be-919da131bf9a" (UID: "d4bbc9d0-4420-4581-a3be-919da131bf9a"). InnerVolumeSpecName "kube-api-access-9tmj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.932628 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config\") pod \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.932826 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zglc9\" (UniqueName: \"kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9\") pod \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.932855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert\") pod \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.932882 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca\") pod \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\" (UID: \"fde8828e-798f-4da5-9f44-0b8a2726dcb1\") " Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.933174 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tmj4\" (UniqueName: \"kubernetes.io/projected/d4bbc9d0-4420-4581-a3be-919da131bf9a-kube-api-access-9tmj4\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.933201 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.933215 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4bbc9d0-4420-4581-a3be-919da131bf9a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.933227 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d4bbc9d0-4420-4581-a3be-919da131bf9a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.934105 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "fde8828e-798f-4da5-9f44-0b8a2726dcb1" (UID: "fde8828e-798f-4da5-9f44-0b8a2726dcb1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.934320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config" (OuterVolumeSpecName: "config") pod "fde8828e-798f-4da5-9f44-0b8a2726dcb1" (UID: "fde8828e-798f-4da5-9f44-0b8a2726dcb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.938327 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fde8828e-798f-4da5-9f44-0b8a2726dcb1" (UID: "fde8828e-798f-4da5-9f44-0b8a2726dcb1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:19:03 crc kubenswrapper[4752]: I1124 11:19:03.938436 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9" (OuterVolumeSpecName: "kube-api-access-zglc9") pod "fde8828e-798f-4da5-9f44-0b8a2726dcb1" (UID: "fde8828e-798f-4da5-9f44-0b8a2726dcb1"). InnerVolumeSpecName "kube-api-access-zglc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.034583 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zglc9\" (UniqueName: \"kubernetes.io/projected/fde8828e-798f-4da5-9f44-0b8a2726dcb1-kube-api-access-zglc9\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.034638 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fde8828e-798f-4da5-9f44-0b8a2726dcb1-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.034659 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.034676 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fde8828e-798f-4da5-9f44-0b8a2726dcb1-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.089823 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p"] Nov 24 11:19:04 crc kubenswrapper[4752]: E1124 11:19:04.090175 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerName="controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.090197 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerName="controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: E1124 11:19:04.090219 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerName="route-controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.090227 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerName="route-controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.090350 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerName="route-controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.090363 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerName="controller-manager" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.090998 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.092988 4752 generic.go:334] "Generic (PLEG): container finished" podID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" containerID="1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711" exitCode=0 Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.093091 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" event={"ID":"fde8828e-798f-4da5-9f44-0b8a2726dcb1","Type":"ContainerDied","Data":"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711"} Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.093098 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.093127 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp" event={"ID":"fde8828e-798f-4da5-9f44-0b8a2726dcb1","Type":"ContainerDied","Data":"18924eeddbfa085b3d991b5633ee912fcff9103dc42341b48e0aaeed1ed3dffc"} Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.093151 4752 scope.go:117] "RemoveContainer" containerID="1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.096438 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-565d48d597-t56g4"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.097844 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.104104 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.107118 4752 generic.go:334] "Generic (PLEG): container finished" podID="d4bbc9d0-4420-4581-a3be-919da131bf9a" containerID="47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4" exitCode=0 Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.107156 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" event={"ID":"d4bbc9d0-4420-4581-a3be-919da131bf9a","Type":"ContainerDied","Data":"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4"} Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.107177 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" event={"ID":"d4bbc9d0-4420-4581-a3be-919da131bf9a","Type":"ContainerDied","Data":"8ff4ce0ecfa6820d3b74338d33dc1911d9c778721e3f0e9de7e97deb5c84ea6e"} Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.107277 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tn49v" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.118532 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565d48d597-t56g4"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.123254 4752 scope.go:117] "RemoveContainer" containerID="1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711" Nov 24 11:19:04 crc kubenswrapper[4752]: E1124 11:19:04.128836 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711\": container with ID starting with 1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711 not found: ID does not exist" containerID="1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.128881 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711"} err="failed to get container status \"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711\": rpc error: code = NotFound desc = could not find container \"1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711\": container with ID starting with 1ae27e93c7d5d004019f41656528a1d0e99ea52994831dfb3076235e9069b711 not found: ID does not exist" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.128906 4752 scope.go:117] "RemoveContainer" containerID="47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136574 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-config\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136624 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-serving-cert\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136654 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-serving-cert\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136733 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-config\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136785 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-client-ca\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136824 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-client-ca\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136854 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghf89\" (UniqueName: \"kubernetes.io/projected/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-kube-api-access-ghf89\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136884 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-proxy-ca-bundles\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.136932 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm22f\" (UniqueName: \"kubernetes.io/projected/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-kube-api-access-vm22f\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.160721 4752 scope.go:117] "RemoveContainer" containerID="47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4" Nov 24 11:19:04 crc kubenswrapper[4752]: E1124 11:19:04.162471 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4\": container with ID starting with 47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4 not found: ID does not exist" containerID="47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.162506 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4"} err="failed to get container status \"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4\": rpc error: code = NotFound desc = could not find container \"47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4\": container with ID starting with 47b456e02fb8855e1dada8a587fb1db963b53e4ff699d728b01e67714d34cab4 not found: ID does not exist" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.167395 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.174823 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cprsp"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.179214 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.182994 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tn49v"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.237873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-serving-cert\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.237921 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-config\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.237947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-client-ca\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.237970 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-client-ca\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.237988 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghf89\" (UniqueName: \"kubernetes.io/projected/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-kube-api-access-ghf89\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.238006 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-proxy-ca-bundles\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.238035 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm22f\" (UniqueName: \"kubernetes.io/projected/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-kube-api-access-vm22f\") pod \"controller-manager-565d48d597-t56g4\" (UID: 
\"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.238081 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-config\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.238098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-serving-cert\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.239965 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-client-ca\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.240036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-config\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.240377 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-client-ca\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.240660 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-config\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.241800 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-proxy-ca-bundles\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.244926 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-serving-cert\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.244941 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-serving-cert\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.255601 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm22f\" (UniqueName: \"kubernetes.io/projected/6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7-kube-api-access-vm22f\") pod \"controller-manager-565d48d597-t56g4\" (UID: \"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7\") " pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.258185 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghf89\" (UniqueName: \"kubernetes.io/projected/4b8169b6-6f73-41eb-99a6-ae990ab1fe5e-kube-api-access-ghf89\") pod \"route-controller-manager-6f7bc4dd7f-s8f5p\" (UID: \"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e\") " pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.428861 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.439739 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.690889 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p"] Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.758217 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4bbc9d0-4420-4581-a3be-919da131bf9a" path="/var/lib/kubelet/pods/d4bbc9d0-4420-4581-a3be-919da131bf9a/volumes" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.759129 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde8828e-798f-4da5-9f44-0b8a2726dcb1" path="/var/lib/kubelet/pods/fde8828e-798f-4da5-9f44-0b8a2726dcb1/volumes" Nov 24 11:19:04 crc kubenswrapper[4752]: I1124 11:19:04.944118 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565d48d597-t56g4"] Nov 24 11:19:05 crc kubenswrapper[4752]: I1124 11:19:05.116411 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" event={"ID":"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7","Type":"ContainerStarted","Data":"c120bf70d3a5d58eae9ffd87dd91467dd46b4df9aaff0ec1ccaea1f8be769ddb"} Nov 24 11:19:05 crc kubenswrapper[4752]: I1124 11:19:05.118038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" event={"ID":"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e","Type":"ContainerStarted","Data":"304f8a676e354550a4f5da45e009b99aafcb1d2d38403d0b3b145b31d72d9692"} Nov 24 11:19:05 crc kubenswrapper[4752]: I1124 11:19:05.118107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" event={"ID":"4b8169b6-6f73-41eb-99a6-ae990ab1fe5e","Type":"ContainerStarted","Data":"70f7c491ddfc16037662bb96f80708c60bb24311133888402c7d88e51b166bb0"} Nov 24 11:19:05 crc 
kubenswrapper[4752]: I1124 11:19:05.118489 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:05 crc kubenswrapper[4752]: I1124 11:19:05.238455 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" Nov 24 11:19:05 crc kubenswrapper[4752]: I1124 11:19:05.263389 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f7bc4dd7f-s8f5p" podStartSLOduration=1.263373315 podStartE2EDuration="1.263373315s" podCreationTimestamp="2025-11-24 11:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:19:05.139639611 +0000 UTC m=+751.124459920" watchObservedRunningTime="2025-11-24 11:19:05.263373315 +0000 UTC m=+751.248193604" Nov 24 11:19:06 crc kubenswrapper[4752]: I1124 11:19:06.124554 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" event={"ID":"6da0b156-fb6a-4764-b2f8-00bcc7b7e8a7","Type":"ContainerStarted","Data":"b2ec0c741e3e151e7d3a7373783e1dbe48914986c353bd5aca192a9ab6f501ab"} Nov 24 11:19:06 crc kubenswrapper[4752]: I1124 11:19:06.145806 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" podStartSLOduration=2.14578337 podStartE2EDuration="2.14578337s" podCreationTimestamp="2025-11-24 11:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:19:06.142768453 +0000 UTC m=+752.127588732" watchObservedRunningTime="2025-11-24 11:19:06.14578337 +0000 UTC m=+752.130603659" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.096983 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2"] Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.098264 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.101266 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.111144 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2"] Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.159385 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.167641 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-565d48d597-t56g4" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.177647 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.177768 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjtb\" (UniqueName: \"kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.177792 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.278976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.279045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.279094 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjtb\" (UniqueName: \"kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb\") pod 
\"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.279756 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.279968 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.296020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjtb\" (UniqueName: \"kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.465450 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:07 crc kubenswrapper[4752]: I1124 11:19:07.872389 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2"] Nov 24 11:19:07 crc kubenswrapper[4752]: W1124 11:19:07.883982 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd853d04f_a54c_4f5a_a9b1_197da017ba29.slice/crio-8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c WatchSource:0}: Error finding container 8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c: Status 404 returned error can't find the container with id 8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c Nov 24 11:19:08 crc kubenswrapper[4752]: I1124 11:19:08.165639 4752 generic.go:334] "Generic (PLEG): container finished" podID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerID="6ff42a2919b6222a592576c1bf593a1ee2ebbdbddd1efa4107ab28c959c24f0b" exitCode=0 Nov 24 11:19:08 crc kubenswrapper[4752]: I1124 11:19:08.166481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" event={"ID":"d853d04f-a54c-4f5a-a9b1-197da017ba29","Type":"ContainerDied","Data":"6ff42a2919b6222a592576c1bf593a1ee2ebbdbddd1efa4107ab28c959c24f0b"} Nov 24 11:19:08 crc kubenswrapper[4752]: I1124 11:19:08.166513 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" event={"ID":"d853d04f-a54c-4f5a-a9b1-197da017ba29","Type":"ContainerStarted","Data":"8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c"} Nov 24 11:19:09 crc 
kubenswrapper[4752]: I1124 11:19:09.122811 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-x7lpn" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" containerName="console" containerID="cri-o://7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f" gracePeriod=15 Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.217640 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.451936 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.453674 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.462005 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.507314 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.507383 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.507422 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54p8t\" (UniqueName: \"kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.608163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.608201 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.608248 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54p8t\" (UniqueName: \"kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.608811 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.608815 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.609241 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-x7lpn_df4cedec-414c-4253-8a55-79ed8c8734c1/console/0.log" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.609311 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.627062 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54p8t\" (UniqueName: \"kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t\") pod \"redhat-operators-ngb78\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709091 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709154 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709176 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709211 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtnz8\" (UniqueName: 
\"kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709287 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle\") pod \"df4cedec-414c-4253-8a55-79ed8c8734c1\" (UID: \"df4cedec-414c-4253-8a55-79ed8c8734c1\") " Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710001 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.709991 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config" (OuterVolumeSpecName: "console-config") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710090 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710183 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-console-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710201 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.710213 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.712401 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8" (OuterVolumeSpecName: "kube-api-access-xtnz8") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "kube-api-access-xtnz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.712660 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.713878 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "df4cedec-414c-4253-8a55-79ed8c8734c1" (UID: "df4cedec-414c-4253-8a55-79ed8c8734c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.768637 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.811343 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.811383 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtnz8\" (UniqueName: \"kubernetes.io/projected/df4cedec-414c-4253-8a55-79ed8c8734c1-kube-api-access-xtnz8\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.811395 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df4cedec-414c-4253-8a55-79ed8c8734c1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:09 crc kubenswrapper[4752]: I1124 11:19:09.811406 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df4cedec-414c-4253-8a55-79ed8c8734c1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.173968 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185066 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-x7lpn_df4cedec-414c-4253-8a55-79ed8c8734c1/console/0.log" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185113 4752 generic.go:334] "Generic (PLEG): container finished" podID="df4cedec-414c-4253-8a55-79ed8c8734c1" containerID="7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f" exitCode=2 Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185158 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-x7lpn" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185158 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x7lpn" event={"ID":"df4cedec-414c-4253-8a55-79ed8c8734c1","Type":"ContainerDied","Data":"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f"} Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185215 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x7lpn" event={"ID":"df4cedec-414c-4253-8a55-79ed8c8734c1","Type":"ContainerDied","Data":"f6b9149c1dc35ac45b08f19616de1d146ce2d1605c090c8982cf399d39d2794d"} Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.185237 4752 scope.go:117] "RemoveContainer" containerID="7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.211294 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.213408 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-x7lpn"] Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.220941 4752 scope.go:117] "RemoveContainer" containerID="7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f" Nov 24 11:19:10 crc kubenswrapper[4752]: E1124 11:19:10.221610 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f\": container with ID starting with 7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f not found: ID does not exist" containerID="7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.221662 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f"} err="failed to get container status \"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f\": rpc error: code = NotFound desc = could not find container \"7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f\": container with ID starting with 7cc8f503cfd71bd1085fbfeed4373f198a7ca6321b6282d221a9eb7f477f008f not found: ID does not exist" Nov 24 11:19:10 crc kubenswrapper[4752]: I1124 11:19:10.735121 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" path="/var/lib/kubelet/pods/df4cedec-414c-4253-8a55-79ed8c8734c1/volumes" Nov 24 11:19:11 crc kubenswrapper[4752]: I1124 11:19:11.195313 4752 generic.go:334] "Generic (PLEG): container finished" podID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerID="81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2" exitCode=0 Nov 24 11:19:11 crc kubenswrapper[4752]: I1124 11:19:11.195377 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerDied","Data":"81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2"} Nov 24 11:19:11 crc kubenswrapper[4752]: I1124 11:19:11.195427 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" 
event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerStarted","Data":"4f4de182b243cb05eca041e1129d83b02998e360b30595a93e9041f06096fd3d"} Nov 24 11:19:12 crc kubenswrapper[4752]: I1124 11:19:12.203097 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerStarted","Data":"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a"} Nov 24 11:19:13 crc kubenswrapper[4752]: I1124 11:19:13.212855 4752 generic.go:334] "Generic (PLEG): container finished" podID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerID="dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a" exitCode=0 Nov 24 11:19:13 crc kubenswrapper[4752]: I1124 11:19:13.212907 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerDied","Data":"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a"} Nov 24 11:19:14 crc kubenswrapper[4752]: I1124 11:19:14.225062 4752 generic.go:334] "Generic (PLEG): container finished" podID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerID="726d06f8c5a70673638c76a615a898b545d533e09651930934cfbbdf0912625b" exitCode=0 Nov 24 11:19:14 crc kubenswrapper[4752]: I1124 11:19:14.225111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" event={"ID":"d853d04f-a54c-4f5a-a9b1-197da017ba29","Type":"ContainerDied","Data":"726d06f8c5a70673638c76a615a898b545d533e09651930934cfbbdf0912625b"} Nov 24 11:19:14 crc kubenswrapper[4752]: I1124 11:19:14.231486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerStarted","Data":"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab"} Nov 24 11:19:14 crc kubenswrapper[4752]: I1124 11:19:14.288018 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngb78" podStartSLOduration=2.853016944 podStartE2EDuration="5.287992912s" podCreationTimestamp="2025-11-24 11:19:09 +0000 UTC" firstStartedPulling="2025-11-24 11:19:11.197536211 +0000 UTC m=+757.182356510" lastFinishedPulling="2025-11-24 11:19:13.632512179 +0000 UTC m=+759.617332478" observedRunningTime="2025-11-24 11:19:14.282586376 +0000 UTC m=+760.267406675" watchObservedRunningTime="2025-11-24 11:19:14.287992912 +0000 UTC m=+760.272813241" Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.242103 4752 generic.go:334] "Generic (PLEG): container finished" podID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerID="e0c50c236f0729c3cc91da954c233fcb275e0b98844680776135e31475fcf6db" exitCode=0 Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.242333 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" event={"ID":"d853d04f-a54c-4f5a-a9b1-197da017ba29","Type":"ContainerDied","Data":"e0c50c236f0729c3cc91da954c233fcb275e0b98844680776135e31475fcf6db"} Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.469314 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.469404 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.469465 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.470139 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:19:15 crc kubenswrapper[4752]: I1124 11:19:15.470224 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b" gracePeriod=600 Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.251238 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b" exitCode=0 Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.251324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b"} Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.251622 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9"} Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.251659 4752 scope.go:117] "RemoveContainer" containerID="da2caaa446f21448894891c3dba7ab4a49ef9e957e5675f710fb5e3d17cf68e4" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.525005 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.617169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcjtb\" (UniqueName: \"kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb\") pod \"d853d04f-a54c-4f5a-a9b1-197da017ba29\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.617500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util\") pod \"d853d04f-a54c-4f5a-a9b1-197da017ba29\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.617525 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle\") pod \"d853d04f-a54c-4f5a-a9b1-197da017ba29\" (UID: \"d853d04f-a54c-4f5a-a9b1-197da017ba29\") " Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.619476 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle" (OuterVolumeSpecName: "bundle") pod "d853d04f-a54c-4f5a-a9b1-197da017ba29" (UID: "d853d04f-a54c-4f5a-a9b1-197da017ba29"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.622378 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb" (OuterVolumeSpecName: "kube-api-access-fcjtb") pod "d853d04f-a54c-4f5a-a9b1-197da017ba29" (UID: "d853d04f-a54c-4f5a-a9b1-197da017ba29"). InnerVolumeSpecName "kube-api-access-fcjtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.630815 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util" (OuterVolumeSpecName: "util") pod "d853d04f-a54c-4f5a-a9b1-197da017ba29" (UID: "d853d04f-a54c-4f5a-a9b1-197da017ba29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.718541 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcjtb\" (UniqueName: \"kubernetes.io/projected/d853d04f-a54c-4f5a-a9b1-197da017ba29-kube-api-access-fcjtb\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.718577 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-util\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:16 crc kubenswrapper[4752]: I1124 11:19:16.718590 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d853d04f-a54c-4f5a-a9b1-197da017ba29-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:17 crc kubenswrapper[4752]: I1124 11:19:17.262707 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" Nov 24 11:19:17 crc kubenswrapper[4752]: I1124 11:19:17.262679 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2" event={"ID":"d853d04f-a54c-4f5a-a9b1-197da017ba29","Type":"ContainerDied","Data":"8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c"} Nov 24 11:19:17 crc kubenswrapper[4752]: I1124 11:19:17.262864 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1f2110fbc66d6f841f6ded9959c47ff1f74ba2f96b2ad5efb15e4d7b34a24c" Nov 24 11:19:19 crc kubenswrapper[4752]: I1124 11:19:19.769625 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:19 crc kubenswrapper[4752]: I1124 11:19:19.770239 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:20 crc kubenswrapper[4752]: I1124 11:19:20.808865 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngb78" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="registry-server" probeResult="failure" output=< Nov 24 11:19:20 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 11:19:20 crc kubenswrapper[4752]: > Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.537808 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj"] Nov 24 11:19:25 crc kubenswrapper[4752]: E1124 11:19:25.538373 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="extract" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538385 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="extract" Nov 24 11:19:25 crc kubenswrapper[4752]: E1124 11:19:25.538393 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="pull" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538399 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="pull" Nov 24 11:19:25 crc kubenswrapper[4752]: E1124 11:19:25.538408 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" containerName="console" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538415 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" containerName="console" Nov 24 11:19:25 crc kubenswrapper[4752]: E1124 11:19:25.538426 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="util" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538432 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="util" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538515 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d853d04f-a54c-4f5a-a9b1-197da017ba29" containerName="extract" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538524 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4cedec-414c-4253-8a55-79ed8c8734c1" 
containerName="console" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.538862 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.541879 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.542022 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.542274 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.542427 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.542501 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-r7fvr" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.565337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj"] Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.632367 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-webhook-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.632422 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4k6\" (UniqueName: \"kubernetes.io/projected/cb94bb73-0a35-4724-af4c-8333c6dbc07c-kube-api-access-gt4k6\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.632445 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-apiservice-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.734206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-webhook-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.735307 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4k6\" (UniqueName: \"kubernetes.io/projected/cb94bb73-0a35-4724-af4c-8333c6dbc07c-kube-api-access-gt4k6\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " 
pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.735417 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-apiservice-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.740825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-webhook-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.754547 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb94bb73-0a35-4724-af4c-8333c6dbc07c-apiservice-cert\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.767194 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc"] Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.768021 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.770311 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-sk68q" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.770427 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4k6\" (UniqueName: \"kubernetes.io/projected/cb94bb73-0a35-4724-af4c-8333c6dbc07c-kube-api-access-gt4k6\") pod \"metallb-operator-controller-manager-6947b4bf66-9rmcj\" (UID: \"cb94bb73-0a35-4724-af4c-8333c6dbc07c\") " pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.770368 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.770407 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.789560 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc"] Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.853642 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.937697 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplwf\" (UniqueName: \"kubernetes.io/projected/a8771fed-834c-4867-b376-1c8b5347b532-kube-api-access-nplwf\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.937844 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-apiservice-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:25 crc kubenswrapper[4752]: I1124 11:19:25.937896 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-webhook-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.038803 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-webhook-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.038879 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplwf\" (UniqueName: \"kubernetes.io/projected/a8771fed-834c-4867-b376-1c8b5347b532-kube-api-access-nplwf\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.038934 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-apiservice-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.044712 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-webhook-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.044809 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8771fed-834c-4867-b376-1c8b5347b532-apiservice-cert\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " 
pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.059983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplwf\" (UniqueName: \"kubernetes.io/projected/a8771fed-834c-4867-b376-1c8b5347b532-kube-api-access-nplwf\") pod \"metallb-operator-webhook-server-5b8d98bc54-4tslc\" (UID: \"a8771fed-834c-4867-b376-1c8b5347b532\") " pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.119728 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.303634 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj"] Nov 24 11:19:26 crc kubenswrapper[4752]: W1124 11:19:26.328376 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb94bb73_0a35_4724_af4c_8333c6dbc07c.slice/crio-07e9bfa15f8854d691c82fb4a3f467c182098413ea5b29a02c44e9d5c75a3be0 WatchSource:0}: Error finding container 07e9bfa15f8854d691c82fb4a3f467c182098413ea5b29a02c44e9d5c75a3be0: Status 404 returned error can't find the container with id 07e9bfa15f8854d691c82fb4a3f467c182098413ea5b29a02c44e9d5c75a3be0 Nov 24 11:19:26 crc kubenswrapper[4752]: I1124 11:19:26.587658 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc"] Nov 24 11:19:26 crc kubenswrapper[4752]: W1124 11:19:26.600610 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8771fed_834c_4867_b376_1c8b5347b532.slice/crio-bb60888ca07c73fcf949dd37f23661d576b4ddbd18381c8d68f150a892da1f6c WatchSource:0}: Error finding container bb60888ca07c73fcf949dd37f23661d576b4ddbd18381c8d68f150a892da1f6c: Status 404 returned error can't find the container with id bb60888ca07c73fcf949dd37f23661d576b4ddbd18381c8d68f150a892da1f6c Nov 24 11:19:27 crc kubenswrapper[4752]: I1124 11:19:27.345823 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" event={"ID":"a8771fed-834c-4867-b376-1c8b5347b532","Type":"ContainerStarted","Data":"bb60888ca07c73fcf949dd37f23661d576b4ddbd18381c8d68f150a892da1f6c"} Nov 24 11:19:27 crc kubenswrapper[4752]: I1124 11:19:27.347503 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" event={"ID":"cb94bb73-0a35-4724-af4c-8333c6dbc07c","Type":"ContainerStarted","Data":"07e9bfa15f8854d691c82fb4a3f467c182098413ea5b29a02c44e9d5c75a3be0"} Nov 24 11:19:29 crc kubenswrapper[4752]: I1124 11:19:29.833315 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:29 crc kubenswrapper[4752]: I1124 11:19:29.883607 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:30 crc kubenswrapper[4752]: I1124 11:19:30.080523 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:30 crc kubenswrapper[4752]: I1124 11:19:30.363794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" event={"ID":"cb94bb73-0a35-4724-af4c-8333c6dbc07c","Type":"ContainerStarted","Data":"6845a9ac8ce45d0a5c1651bd3b1474a6d90e3f31e4645cf08c06b651d9ce46d3"} Nov 24 11:19:30 crc kubenswrapper[4752]: I1124 11:19:30.363872 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:19:30 crc kubenswrapper[4752]: I1124 11:19:30.389679 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" podStartSLOduration=2.258679383 podStartE2EDuration="5.389641147s" podCreationTimestamp="2025-11-24 11:19:25 +0000 UTC" firstStartedPulling="2025-11-24 11:19:26.342707715 +0000 UTC m=+772.327528004" lastFinishedPulling="2025-11-24 11:19:29.473669479 +0000 UTC m=+775.458489768" observedRunningTime="2025-11-24 11:19:30.382195511 +0000 UTC m=+776.367015800" watchObservedRunningTime="2025-11-24 11:19:30.389641147 +0000 UTC m=+776.374461476" Nov 24 11:19:31 crc kubenswrapper[4752]: I1124 11:19:31.373297 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngb78" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="registry-server" containerID="cri-o://22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab" gracePeriod=2 Nov 24 11:19:31 crc kubenswrapper[4752]: I1124 11:19:31.855233 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.017045 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities\") pod \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.017098 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54p8t\" (UniqueName: \"kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t\") pod \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.017177 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content\") pod \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\" (UID: \"b46cbf23-b328-4d38-8f11-bb7bb81244fb\") " Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.018419 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities" (OuterVolumeSpecName: "utilities") pod "b46cbf23-b328-4d38-8f11-bb7bb81244fb" (UID: "b46cbf23-b328-4d38-8f11-bb7bb81244fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.023201 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t" (OuterVolumeSpecName: "kube-api-access-54p8t") pod "b46cbf23-b328-4d38-8f11-bb7bb81244fb" (UID: "b46cbf23-b328-4d38-8f11-bb7bb81244fb"). InnerVolumeSpecName "kube-api-access-54p8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.106945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b46cbf23-b328-4d38-8f11-bb7bb81244fb" (UID: "b46cbf23-b328-4d38-8f11-bb7bb81244fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.118338 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.118385 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46cbf23-b328-4d38-8f11-bb7bb81244fb-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.118399 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54p8t\" (UniqueName: \"kubernetes.io/projected/b46cbf23-b328-4d38-8f11-bb7bb81244fb-kube-api-access-54p8t\") on node \"crc\" DevicePath \"\"" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.380522 4752 generic.go:334] "Generic (PLEG): container finished" podID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerID="22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab" exitCode=0 Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.380611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerDied","Data":"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab"} Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.380669 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngb78" event={"ID":"b46cbf23-b328-4d38-8f11-bb7bb81244fb","Type":"ContainerDied","Data":"4f4de182b243cb05eca041e1129d83b02998e360b30595a93e9041f06096fd3d"} Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.380691 4752 scope.go:117] "RemoveContainer" containerID="22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.380633 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngb78" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.383220 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" event={"ID":"a8771fed-834c-4867-b376-1c8b5347b532","Type":"ContainerStarted","Data":"8eabf1d58155fb55188729f7dc3422c51f82d8b27fc45675a1316b8d127b13e3"} Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.383427 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.402946 4752 scope.go:117] "RemoveContainer" containerID="dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.420298 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" podStartSLOduration=2.684863095 podStartE2EDuration="7.420277865s" podCreationTimestamp="2025-11-24 11:19:25 +0000 UTC" firstStartedPulling="2025-11-24 11:19:26.603633961 +0000 UTC m=+772.588454260" lastFinishedPulling="2025-11-24 11:19:31.339048741 +0000 UTC m=+777.323869030" observedRunningTime="2025-11-24 11:19:32.417490704 +0000 UTC m=+778.402310993" watchObservedRunningTime="2025-11-24 11:19:32.420277865 +0000 UTC m=+778.405098154" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.432293 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.434096 4752 scope.go:117] "RemoveContainer" containerID="81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.438009 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngb78"] Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.453240 4752 scope.go:117] "RemoveContainer" containerID="22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab" Nov 24 11:19:32 crc kubenswrapper[4752]: E1124 11:19:32.453847 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab\": container with ID starting with 22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab not found: ID does not exist" containerID="22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.453914 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab"} err="failed to get container status \"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab\": rpc error: code = NotFound desc = could not find container \"22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab\": container with ID starting with 22aaaf43e8f59bf2e592328930b71a3c0d490c5a78d8cc6ba6e32614e6be7bab not found: ID does not exist" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.453944 4752 scope.go:117] "RemoveContainer" containerID="dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a" Nov 24 11:19:32 crc kubenswrapper[4752]: E1124 11:19:32.454555 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a\": container with ID starting with dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a not found: ID does not exist" containerID="dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.454599 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a"} err="failed to get container status \"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a\": rpc error: code = NotFound desc = could not find container \"dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a\": container with ID starting with dd2dce82ca0eb8d05a13a5a9005eeed2afcbacbac7ea95d9478af4a38a23738a not found: ID does not exist" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.454626 4752 scope.go:117] "RemoveContainer" containerID="81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2" Nov 24 11:19:32 crc kubenswrapper[4752]: E1124 11:19:32.455102 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2\": container with ID starting with 81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2 not found: ID does not exist" containerID="81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.455132 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2"} err="failed to get container status \"81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2\": rpc error: code = NotFound desc = could not find container \"81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2\": container with ID starting with 81fcfa154af3be2ad6b7a8f1ca9b88f8d8e8667b31022e1fe76735e27e407cb2 not found: ID does not exist" Nov 24 11:19:32 crc kubenswrapper[4752]: I1124 11:19:32.739449 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" path="/var/lib/kubelet/pods/b46cbf23-b328-4d38-8f11-bb7bb81244fb/volumes" Nov 24 11:19:46 crc kubenswrapper[4752]: I1124 11:19:46.126388 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b8d98bc54-4tslc" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.290998 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:19:49 crc kubenswrapper[4752]: E1124 11:19:49.291698 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="extract-utilities" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.291721 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="extract-utilities" Nov 24 11:19:49 crc kubenswrapper[4752]: E1124 11:19:49.291753 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="extract-content" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.291767 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="extract-content" Nov 24 
11:19:49 crc kubenswrapper[4752]: E1124 11:19:49.291803 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="registry-server" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.291813 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="registry-server" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.291991 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46cbf23-b328-4d38-8f11-bb7bb81244fb" containerName="registry-server" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.297320 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.312816 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.498897 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.498945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.499028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgg5\" (UniqueName: \"kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.600062 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cgg5\" (UniqueName: \"kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.600138 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.600165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.600585 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.601033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.618292 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cgg5\" (UniqueName: \"kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5\") pod \"redhat-marketplace-8ndps\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:49 crc kubenswrapper[4752]: I1124 11:19:49.917437 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:50 crc kubenswrapper[4752]: I1124 11:19:50.341300 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.008842 4752 generic.go:334] "Generic (PLEG): container finished" podID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerID="41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57" exitCode=0 Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.009075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerDied","Data":"41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57"} Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.009225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerStarted","Data":"c5f5a30bb589eaf5949e96e4d96c9de13c594951bfb59fed2e11694eee2cf5c7"} Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.853749 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.854742 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:51 crc kubenswrapper[4752]: I1124 11:19:51.867028 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.016404 4752 generic.go:334] "Generic (PLEG): container finished" podID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerID="0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1" exitCode=0 Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.016461 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerDied","Data":"0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1"} Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.035683 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8bh\" (UniqueName: \"kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.035778 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.035812 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.136510 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.136603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8bh\" (UniqueName: \"kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.136648 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.137196 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content\") pod \"community-operators-j5fnb\" 
(UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.137215 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.175690 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8bh\" (UniqueName: \"kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh\") pod \"community-operators-j5fnb\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.473186 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:19:52 crc kubenswrapper[4752]: I1124 11:19:52.680151 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:19:52 crc kubenswrapper[4752]: W1124 11:19:52.686452 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf25e6c50_6d1a_453a_849e_8f17eede07ba.slice/crio-b0030f6a3bb5e3d53a6d4cbb98d5e1d0d96bdb7ad6ee6a70a0c2e3eacae5be3f WatchSource:0}: Error finding container b0030f6a3bb5e3d53a6d4cbb98d5e1d0d96bdb7ad6ee6a70a0c2e3eacae5be3f: Status 404 returned error can't find the container with id b0030f6a3bb5e3d53a6d4cbb98d5e1d0d96bdb7ad6ee6a70a0c2e3eacae5be3f Nov 24 11:19:53 crc kubenswrapper[4752]: I1124 11:19:53.023241 4752 generic.go:334] "Generic (PLEG): container finished" podID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerID="67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea" exitCode=0 Nov 24 11:19:53 crc kubenswrapper[4752]: I1124 11:19:53.023327 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerDied","Data":"67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea"} Nov 24 11:19:53 crc kubenswrapper[4752]: I1124 11:19:53.023364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerStarted","Data":"b0030f6a3bb5e3d53a6d4cbb98d5e1d0d96bdb7ad6ee6a70a0c2e3eacae5be3f"} Nov 24 11:19:53 crc kubenswrapper[4752]: I1124 11:19:53.025737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerStarted","Data":"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4"} Nov 24 11:19:53 crc kubenswrapper[4752]: I1124 11:19:53.061876 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ndps" podStartSLOduration=2.666967546 podStartE2EDuration="4.06186134s" podCreationTimestamp="2025-11-24 11:19:49 +0000 UTC" firstStartedPulling="2025-11-24 11:19:51.010961323 +0000 UTC m=+796.995781662" lastFinishedPulling="2025-11-24 11:19:52.405855167 +0000 UTC m=+798.390675456" observedRunningTime="2025-11-24 11:19:53.059971586 +0000 UTC 
m=+799.044791875" watchObservedRunningTime="2025-11-24 11:19:53.06186134 +0000 UTC m=+799.046681629" Nov 24 11:19:54 crc kubenswrapper[4752]: I1124 11:19:54.041217 4752 generic.go:334] "Generic (PLEG): container finished" podID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerID="632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9" exitCode=0 Nov 24 11:19:54 crc kubenswrapper[4752]: I1124 11:19:54.042283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerDied","Data":"632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9"} Nov 24 11:19:55 crc kubenswrapper[4752]: I1124 11:19:55.050379 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerStarted","Data":"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02"} Nov 24 11:19:55 crc kubenswrapper[4752]: I1124 11:19:55.067871 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5fnb" podStartSLOduration=2.569476915 podStartE2EDuration="4.067853751s" podCreationTimestamp="2025-11-24 11:19:51 +0000 UTC" firstStartedPulling="2025-11-24 11:19:53.024913762 +0000 UTC m=+799.009734051" lastFinishedPulling="2025-11-24 11:19:54.523290588 +0000 UTC m=+800.508110887" observedRunningTime="2025-11-24 11:19:55.06438048 +0000 UTC m=+801.049200779" watchObservedRunningTime="2025-11-24 11:19:55.067853751 +0000 UTC m=+801.052674050" Nov 24 11:19:59 crc kubenswrapper[4752]: I1124 11:19:59.918065 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:59 crc kubenswrapper[4752]: I1124 11:19:59.918866 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:19:59 crc kubenswrapper[4752]: I1124 11:19:59.990297 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:20:00 crc kubenswrapper[4752]: I1124 11:20:00.132226 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.452222 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.453043 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ndps" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="registry-server" containerID="cri-o://8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4" gracePeriod=2 Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.473995 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.474035 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.534694 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 
11:20:02.865529 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.989972 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cgg5\" (UniqueName: \"kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5\") pod \"8f543638-97bc-4db2-8b92-3f331f15c55a\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.990186 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content\") pod \"8f543638-97bc-4db2-8b92-3f331f15c55a\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.990246 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities\") pod \"8f543638-97bc-4db2-8b92-3f331f15c55a\" (UID: \"8f543638-97bc-4db2-8b92-3f331f15c55a\") " Nov 24 11:20:02 crc kubenswrapper[4752]: I1124 11:20:02.992063 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities" (OuterVolumeSpecName: "utilities") pod "8f543638-97bc-4db2-8b92-3f331f15c55a" (UID: "8f543638-97bc-4db2-8b92-3f331f15c55a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:02.999993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5" (OuterVolumeSpecName: "kube-api-access-7cgg5") pod "8f543638-97bc-4db2-8b92-3f331f15c55a" (UID: "8f543638-97bc-4db2-8b92-3f331f15c55a"). InnerVolumeSpecName "kube-api-access-7cgg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.031930 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f543638-97bc-4db2-8b92-3f331f15c55a" (UID: "8f543638-97bc-4db2-8b92-3f331f15c55a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.091709 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.091763 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f543638-97bc-4db2-8b92-3f331f15c55a-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.091780 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cgg5\" (UniqueName: \"kubernetes.io/projected/8f543638-97bc-4db2-8b92-3f331f15c55a-kube-api-access-7cgg5\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.099976 4752 generic.go:334] "Generic (PLEG): container finished" podID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerID="8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4" exitCode=0 Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.100039 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerDied","Data":"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4"} Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.100077 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ndps" event={"ID":"8f543638-97bc-4db2-8b92-3f331f15c55a","Type":"ContainerDied","Data":"c5f5a30bb589eaf5949e96e4d96c9de13c594951bfb59fed2e11694eee2cf5c7"} Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.100074 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ndps" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.100115 4752 scope.go:117] "RemoveContainer" containerID="8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.124378 4752 scope.go:117] "RemoveContainer" containerID="0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.141506 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.148213 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ndps"] Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.158253 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.164592 4752 scope.go:117] "RemoveContainer" containerID="41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.182896 4752 scope.go:117] "RemoveContainer" containerID="8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4" Nov 24 11:20:03 crc kubenswrapper[4752]: E1124 11:20:03.184383 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4\": container with ID starting with 8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4 not found: ID does not exist" containerID="8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.184480 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4"} err="failed to get container status \"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4\": rpc error: code = NotFound desc = could not find container \"8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4\": container with ID starting with 8ec347c24c89fb0e2cccafd0c0b6a23eea61c6ef6e3f0985d65c49803385caf4 not found: ID does not exist" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.184561 4752 scope.go:117] "RemoveContainer" containerID="0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1" Nov 24 11:20:03 crc kubenswrapper[4752]: E1124 11:20:03.184928 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1\": container with ID starting with 0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1 not found: ID does not exist" containerID="0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.184959 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1"} err="failed to get container status \"0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1\": rpc error: code = NotFound desc = could not find container \"0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1\": container with ID starting with 
0e1b631030c7d4f1b07b4ca8837a20302a2b32f113f1c999be7bc493fbaa8ab1 not found: ID does not exist" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.184982 4752 scope.go:117] "RemoveContainer" containerID="41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57" Nov 24 11:20:03 crc kubenswrapper[4752]: E1124 11:20:03.185202 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57\": container with ID starting with 41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57 not found: ID does not exist" containerID="41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57" Nov 24 11:20:03 crc kubenswrapper[4752]: I1124 11:20:03.185281 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57"} err="failed to get container status \"41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57\": rpc error: code = NotFound desc = could not find container \"41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57\": container with ID starting with 41b89df6a318b8e4795629d419ac0f3f95d778084b68b0779d654b1026ce0e57 not found: ID does not exist" Nov 24 11:20:04 crc kubenswrapper[4752]: I1124 11:20:04.741941 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" path="/var/lib/kubelet/pods/8f543638-97bc-4db2-8b92-3f331f15c55a/volumes" Nov 24 11:20:05 crc kubenswrapper[4752]: I1124 11:20:05.856684 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6947b4bf66-9rmcj" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.047562 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.048034 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5fnb" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="registry-server" containerID="cri-o://d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02" gracePeriod=2 Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.490382 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.535632 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jjkc5"] Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.535913 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="extract-content" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.535930 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="extract-content" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.535943 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="extract-content" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.535952 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="extract-content" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.535965 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="extract-utilities" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.535973 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="extract-utilities" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.535986 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="extract-utilities" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.535994 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="extract-utilities" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.536003 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.536012 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.536023 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.536031 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.536180 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.536197 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f543638-97bc-4db2-8b92-3f331f15c55a" containerName="registry-server" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.538406 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.540766 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.540898 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zf58r" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.541005 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.551060 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.552156 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.554769 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.562165 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.636165 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content\") pod \"f25e6c50-6d1a-453a-849e-8f17eede07ba\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.636236 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities\") pod \"f25e6c50-6d1a-453a-849e-8f17eede07ba\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.636407 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td8bh\" (UniqueName: \"kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh\") pod \"f25e6c50-6d1a-453a-849e-8f17eede07ba\" (UID: \"f25e6c50-6d1a-453a-849e-8f17eede07ba\") " Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.637819 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities" (OuterVolumeSpecName: "utilities") pod "f25e6c50-6d1a-453a-849e-8f17eede07ba" (UID: "f25e6c50-6d1a-453a-849e-8f17eede07ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.643213 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh" (OuterVolumeSpecName: "kube-api-access-td8bh") pod "f25e6c50-6d1a-453a-849e-8f17eede07ba" (UID: "f25e6c50-6d1a-453a-849e-8f17eede07ba"). InnerVolumeSpecName "kube-api-access-td8bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.644017 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zscb2"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.645630 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.650898 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.650941 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-h7fbg" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.651095 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.651148 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.654275 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-w4xmc"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.655619 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.657787 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.671707 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-w4xmc"] Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.719351 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f25e6c50-6d1a-453a-849e-8f17eede07ba" (UID: "f25e6c50-6d1a-453a-849e-8f17eede07ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.738290 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxvh\" (UniqueName: \"kubernetes.io/projected/e7ee7fd9-f333-4573-bd44-49ffa58e8389-kube-api-access-qpxvh\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.738392 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-conf\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.738853 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-sockets\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-reloader\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvtm\" (UniqueName: \"kubernetes.io/projected/1244a8ec-63d5-438f-8f4f-40796b68de59-kube-api-access-9mvtm\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739405 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-startup\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739480 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739512 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" 
Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739581 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td8bh\" (UniqueName: \"kubernetes.io/projected/f25e6c50-6d1a-453a-849e-8f17eede07ba-kube-api-access-td8bh\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739603 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.739616 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e6c50-6d1a-453a-849e-8f17eede07ba-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4tt\" (UniqueName: \"kubernetes.io/projected/2731a61e-82f6-43be-93c8-d0a5f9000bec-kube-api-access-4n4tt\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvtm\" (UniqueName: \"kubernetes.io/projected/1244a8ec-63d5-438f-8f4f-40796b68de59-kube-api-access-9mvtm\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-cert\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841352 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4cc40b5d-710f-44a1-917e-b330c4dcab18-metallb-excludel2\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841369 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-startup\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841408 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-metrics-certs\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 
11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841436 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841457 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxvh\" (UniqueName: \"kubernetes.io/projected/e7ee7fd9-f333-4573-bd44-49ffa58e8389-kube-api-access-qpxvh\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841523 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841563 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-conf\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-sockets\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-metrics-certs\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841650 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-reloader\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.841671 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnx7\" (UniqueName: \"kubernetes.io/projected/4cc40b5d-710f-44a1-917e-b330c4dcab18-kube-api-access-2dnx7\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.841863 4752 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 24 11:20:06 crc 
kubenswrapper[4752]: E1124 11:20:06.841920 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert podName:e7ee7fd9-f333-4573-bd44-49ffa58e8389 nodeName:}" failed. No retries permitted until 2025-11-24 11:20:07.341899179 +0000 UTC m=+813.326719468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert") pod "frr-k8s-webhook-server-6998585d5-wwxm2" (UID: "e7ee7fd9-f333-4573-bd44-49ffa58e8389") : secret "frr-k8s-webhook-server-cert" not found Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.842943 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-conf\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.843094 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-startup\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.843183 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-frr-sockets\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.843249 4752 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.843327 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs podName:1244a8ec-63d5-438f-8f4f-40796b68de59 nodeName:}" failed. No retries permitted until 2025-11-24 11:20:07.343305309 +0000 UTC m=+813.328125598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs") pod "frr-k8s-jjkc5" (UID: "1244a8ec-63d5-438f-8f4f-40796b68de59") : secret "frr-k8s-certs-secret" not found Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.843614 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.844148 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1244a8ec-63d5-438f-8f4f-40796b68de59-reloader\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.872850 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvtm\" (UniqueName: \"kubernetes.io/projected/1244a8ec-63d5-438f-8f4f-40796b68de59-kube-api-access-9mvtm\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.879037 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxvh\" (UniqueName: \"kubernetes.io/projected/e7ee7fd9-f333-4573-bd44-49ffa58e8389-kube-api-access-qpxvh\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942443 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-metrics-certs\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942495 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnx7\" (UniqueName: \"kubernetes.io/projected/4cc40b5d-710f-44a1-917e-b330c4dcab18-kube-api-access-2dnx7\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942521 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4tt\" (UniqueName: \"kubernetes.io/projected/2731a61e-82f6-43be-93c8-d0a5f9000bec-kube-api-access-4n4tt\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942551 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-cert\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942578 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4cc40b5d-710f-44a1-917e-b330c4dcab18-metallb-excludel2\") pod \"speaker-zscb2\" (UID: 
\"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942611 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-metrics-certs\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.942643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.942921 4752 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 24 11:20:06 crc kubenswrapper[4752]: E1124 11:20:06.942988 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist podName:4cc40b5d-710f-44a1-917e-b330c4dcab18 nodeName:}" failed. No retries permitted until 2025-11-24 11:20:07.44296861 +0000 UTC m=+813.427788899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist") pod "speaker-zscb2" (UID: "4cc40b5d-710f-44a1-917e-b330c4dcab18") : secret "metallb-memberlist" not found Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.943787 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4cc40b5d-710f-44a1-917e-b330c4dcab18-metallb-excludel2\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.945736 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.947365 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-metrics-certs\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.947497 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-metrics-certs\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.957622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2731a61e-82f6-43be-93c8-d0a5f9000bec-cert\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.957819 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnx7\" (UniqueName: \"kubernetes.io/projected/4cc40b5d-710f-44a1-917e-b330c4dcab18-kube-api-access-2dnx7\") pod \"speaker-zscb2\" (UID: 
\"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:06 crc kubenswrapper[4752]: I1124 11:20:06.964092 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4tt\" (UniqueName: \"kubernetes.io/projected/2731a61e-82f6-43be-93c8-d0a5f9000bec-kube-api-access-4n4tt\") pod \"controller-6c7b4b5f48-w4xmc\" (UID: \"2731a61e-82f6-43be-93c8-d0a5f9000bec\") " pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.003442 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.126522 4752 generic.go:334] "Generic (PLEG): container finished" podID="f25e6c50-6d1a-453a-849e-8f17eede07ba" containerID="d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02" exitCode=0 Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.126561 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerDied","Data":"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02"} Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.126589 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5fnb" event={"ID":"f25e6c50-6d1a-453a-849e-8f17eede07ba","Type":"ContainerDied","Data":"b0030f6a3bb5e3d53a6d4cbb98d5e1d0d96bdb7ad6ee6a70a0c2e3eacae5be3f"} Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.126604 4752 scope.go:117] "RemoveContainer" containerID="d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.126760 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5fnb" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.163407 4752 scope.go:117] "RemoveContainer" containerID="632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.164633 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.191033 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5fnb"] Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.220084 4752 scope.go:117] "RemoveContainer" containerID="67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.233930 4752 scope.go:117] "RemoveContainer" containerID="d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02" Nov 24 11:20:07 crc kubenswrapper[4752]: E1124 11:20:07.235253 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02\": container with ID starting with d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02 not found: ID does not exist" containerID="d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.235292 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02"} err="failed to get container status \"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02\": rpc error: code = NotFound desc = could not find container \"d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02\": container with ID starting with d5f395a5e27945dbd72a3b4a499ef280d371a29cbb497b7d952a50a2b35f7c02 not found: ID does not exist" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.235320 4752 scope.go:117] "RemoveContainer" containerID="632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9" Nov 24 11:20:07 crc kubenswrapper[4752]: E1124 11:20:07.235986 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9\": container with ID starting with 632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9 not found: ID does not exist" containerID="632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.236024 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9"} err="failed to get container status \"632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9\": rpc error: code = NotFound desc = could not find container \"632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9\": container with ID starting with 632871015aa9887d48c56c2d32a0b81fb3d075a3cf19ccdf7051e5c02914cfa9 not found: ID does not exist" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.236044 4752 scope.go:117] "RemoveContainer" containerID="67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea" Nov 24 11:20:07 crc kubenswrapper[4752]: E1124 11:20:07.236288 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea\": container with ID starting with 67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea not found: ID does not exist" containerID="67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.236318 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea"} err="failed to get container status \"67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea\": rpc error: code = NotFound desc = could not find container \"67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea\": container with ID starting with 67790045c8150930b6157dd445a4a4b28769e3a1fe4bff0af02822a9eaad94ea not found: ID does not exist" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.349576 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.349659 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.352983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7ee7fd9-f333-4573-bd44-49ffa58e8389-cert\") pod \"frr-k8s-webhook-server-6998585d5-wwxm2\" (UID: \"e7ee7fd9-f333-4573-bd44-49ffa58e8389\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.354388 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1244a8ec-63d5-438f-8f4f-40796b68de59-metrics-certs\") pod \"frr-k8s-jjkc5\" (UID: \"1244a8ec-63d5-438f-8f4f-40796b68de59\") " pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.450637 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:07 crc kubenswrapper[4752]: E1124 11:20:07.450865 4752 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 24 11:20:07 crc kubenswrapper[4752]: E1124 11:20:07.450942 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist podName:4cc40b5d-710f-44a1-917e-b330c4dcab18 nodeName:}" failed. No retries permitted until 2025-11-24 11:20:08.450923224 +0000 UTC m=+814.435743513 (durationBeforeRetry 1s). 
Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.454232 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.477539 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.486700 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-w4xmc"] Nov 24 11:20:07 crc kubenswrapper[4752]: W1124 11:20:07.501346 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2731a61e_82f6_43be_93c8_d0a5f9000bec.slice/crio-7c94d2e177c4e356ea06bd4910104097bf2099ec60a9ba9d9ae6fae422f5ab0c WatchSource:0}: Error finding container 7c94d2e177c4e356ea06bd4910104097bf2099ec60a9ba9d9ae6fae422f5ab0c: Status 404 returned error can't find the container with id 7c94d2e177c4e356ea06bd4910104097bf2099ec60a9ba9d9ae6fae422f5ab0c Nov 24 11:20:07 crc kubenswrapper[4752]: I1124 11:20:07.883151 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2"] Nov 24 11:20:07 crc kubenswrapper[4752]: W1124 11:20:07.889239 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7ee7fd9_f333_4573_bd44_49ffa58e8389.slice/crio-88443d6495c8080caba516241d918b25f09c05f560a933318caaabc51ab92207 WatchSource:0}: Error finding container 88443d6495c8080caba516241d918b25f09c05f560a933318caaabc51ab92207: Status 404 returned error can't find the container with id 88443d6495c8080caba516241d918b25f09c05f560a933318caaabc51ab92207 Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.135903 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-w4xmc" event={"ID":"2731a61e-82f6-43be-93c8-d0a5f9000bec","Type":"ContainerStarted","Data":"b2e1151ac365eb7d9e88470a0ccdaa586d707fa3fcf3f5ea529e1c8615a9dc12"} Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.136042 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-w4xmc" event={"ID":"2731a61e-82f6-43be-93c8-d0a5f9000bec","Type":"ContainerStarted","Data":"c6a8c8689ca71a831a8a2ba243ab9283e5615343e42a54043fe93c5ef9535a79"} Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.136074 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-w4xmc" Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.136120 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-w4xmc" event={"ID":"2731a61e-82f6-43be-93c8-d0a5f9000bec","Type":"ContainerStarted","Data":"7c94d2e177c4e356ea06bd4910104097bf2099ec60a9ba9d9ae6fae422f5ab0c"} Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.138225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"d6800b8ee7a3dbe9d7bcf3a8dbdc9958110cb15a961f80292352001d19061705"} Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.139432
4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" event={"ID":"e7ee7fd9-f333-4573-bd44-49ffa58e8389","Type":"ContainerStarted","Data":"88443d6495c8080caba516241d918b25f09c05f560a933318caaabc51ab92207"} Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.163855 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-w4xmc" podStartSLOduration=2.163834013 podStartE2EDuration="2.163834013s" podCreationTimestamp="2025-11-24 11:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:20:08.163759461 +0000 UTC m=+814.148579750" watchObservedRunningTime="2025-11-24 11:20:08.163834013 +0000 UTC m=+814.148654302" Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.461456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.470191 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4cc40b5d-710f-44a1-917e-b330c4dcab18-memberlist\") pod \"speaker-zscb2\" (UID: \"4cc40b5d-710f-44a1-917e-b330c4dcab18\") " pod="metallb-system/speaker-zscb2" Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.494005 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zscb2" Nov 24 11:20:08 crc kubenswrapper[4752]: W1124 11:20:08.512451 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc40b5d_710f_44a1_917e_b330c4dcab18.slice/crio-a4293761478564d17bda357dbe8f60bd15549663d5f1367d632c15cec7704ea8 WatchSource:0}: Error finding container a4293761478564d17bda357dbe8f60bd15549663d5f1367d632c15cec7704ea8: Status 404 returned error can't find the container with id a4293761478564d17bda357dbe8f60bd15549663d5f1367d632c15cec7704ea8 Nov 24 11:20:08 crc kubenswrapper[4752]: I1124 11:20:08.738732 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25e6c50-6d1a-453a-849e-8f17eede07ba" path="/var/lib/kubelet/pods/f25e6c50-6d1a-453a-849e-8f17eede07ba/volumes" Nov 24 11:20:09 crc kubenswrapper[4752]: I1124 11:20:09.147661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zscb2" event={"ID":"4cc40b5d-710f-44a1-917e-b330c4dcab18","Type":"ContainerStarted","Data":"8dbc53afdf093498db7f6c4d614947b2ffe485f0bdbf70025ce5a4414272f07a"} Nov 24 11:20:09 crc kubenswrapper[4752]: I1124 11:20:09.148021 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zscb2" event={"ID":"4cc40b5d-710f-44a1-917e-b330c4dcab18","Type":"ContainerStarted","Data":"856338384e242f2be6f11582019924b705ed54ace50830b850c753f2b3343871"} Nov 24 11:20:09 crc kubenswrapper[4752]: I1124 11:20:09.148031 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zscb2" event={"ID":"4cc40b5d-710f-44a1-917e-b330c4dcab18","Type":"ContainerStarted","Data":"a4293761478564d17bda357dbe8f60bd15549663d5f1367d632c15cec7704ea8"} Nov 24 11:20:09 crc kubenswrapper[4752]: I1124 11:20:09.148538 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-zscb2" Nov 24 11:20:09 crc kubenswrapper[4752]: I1124 11:20:09.166823 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zscb2" podStartSLOduration=3.166804187 podStartE2EDuration="3.166804187s" podCreationTimestamp="2025-11-24 11:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:20:09.166610512 +0000 UTC m=+815.151430801" watchObservedRunningTime="2025-11-24 11:20:09.166804187 +0000 UTC m=+815.151624476" Nov 24 11:20:15 crc kubenswrapper[4752]: I1124 11:20:15.193551 4752 generic.go:334] "Generic (PLEG): container finished" podID="1244a8ec-63d5-438f-8f4f-40796b68de59" containerID="fce003b2c48f3067de3d4f57d9c3d8a876ec11052bfba36eab94575cfbe3959d" exitCode=0 Nov 24 11:20:15 crc kubenswrapper[4752]: I1124 11:20:15.193629 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerDied","Data":"fce003b2c48f3067de3d4f57d9c3d8a876ec11052bfba36eab94575cfbe3959d"} Nov 24 11:20:15 crc kubenswrapper[4752]: I1124 11:20:15.196511 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" event={"ID":"e7ee7fd9-f333-4573-bd44-49ffa58e8389","Type":"ContainerStarted","Data":"17f7d51163ff5742a780651e9f18e0cb880627d3c55c35c47b435915f3188824"} Nov 24 11:20:15 crc kubenswrapper[4752]: I1124 11:20:15.196818 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:15 crc kubenswrapper[4752]: I1124 11:20:15.247792 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" podStartSLOduration=2.656255848 podStartE2EDuration="9.247770318s" podCreationTimestamp="2025-11-24 11:20:06 +0000 UTC" firstStartedPulling="2025-11-24 11:20:07.891441698 +0000 UTC m=+813.876261977" lastFinishedPulling="2025-11-24 11:20:14.482956148 +0000 UTC m=+820.467776447" observedRunningTime="2025-11-24 11:20:15.245277336 +0000 UTC m=+821.230097635" watchObservedRunningTime="2025-11-24 11:20:15.247770318 +0000 UTC m=+821.232590627" Nov 24 11:20:15 crc kubenswrapper[4752]: E1124 11:20:15.442142 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1244a8ec_63d5_438f_8f4f_40796b68de59.slice/crio-8996e425a9929033af8c7cd149ab49503518db0d2744ebbf0a615d4383f6d183.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1244a8ec_63d5_438f_8f4f_40796b68de59.slice/crio-conmon-8996e425a9929033af8c7cd149ab49503518db0d2744ebbf0a615d4383f6d183.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:20:16 crc kubenswrapper[4752]: I1124 11:20:16.205944 4752 generic.go:334] "Generic (PLEG): container finished" podID="1244a8ec-63d5-438f-8f4f-40796b68de59" containerID="8996e425a9929033af8c7cd149ab49503518db0d2744ebbf0a615d4383f6d183" exitCode=0 Nov 24 11:20:16 crc kubenswrapper[4752]: I1124 11:20:16.205995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerDied","Data":"8996e425a9929033af8c7cd149ab49503518db0d2744ebbf0a615d4383f6d183"} Nov 24 11:20:17 crc 
Nov 24 11:20:17 crc kubenswrapper[4752]: I1124 11:20:17.214515 4752 generic.go:334] "Generic (PLEG): container finished" podID="1244a8ec-63d5-438f-8f4f-40796b68de59" containerID="20ae49c51e93303bcafe590995efbff36e2cbfce74c24133ea3ec113261f1cd9" exitCode=0 Nov 24 11:20:17 crc kubenswrapper[4752]: I1124 11:20:17.214603 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerDied","Data":"20ae49c51e93303bcafe590995efbff36e2cbfce74c24133ea3ec113261f1cd9"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.105336 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.107721 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.112202 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.222652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4rc\" (UniqueName: \"kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.222909 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.223037 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.228199 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"7383ff0736cb1f1e7b532f792db6291dc2c00285a95914bcfaf4f85b1f2c5105"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.228537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"f26c719989009adfaef86ca8d66769da32869a2051a0e539e20dec702605e1b6"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.228549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"91b6fe1ed19d0f89279eb2636e4d57530c7fe603c5125bfec0107eefd0fba4c5"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.228558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"9c74884ac6120ad2e2fe2008e333728f98bab15bafdd3ee0c45dcdbd95b4ca26"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.228566 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"ce5bb5ac99246ce562b0d112d3f812d6c3ea393ebb86ce94211b056d0b7fdc29"} Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.324212 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.324257 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4rc\" (UniqueName: \"kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.324324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.325069 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.325207 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.347858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4rc\" (UniqueName: \"kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc\") pod \"certified-operators-pznxb\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.425375 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.497661 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zscb2" Nov 24 11:20:18 crc kubenswrapper[4752]: I1124 11:20:18.673684 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:18 crc kubenswrapper[4752]: W1124 11:20:18.682555 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef6b9d6_313d_4bde_9cdf_9a64b751a7cf.slice/crio-3b2a5fd4eaf57ba24251cff4f08bd6fef10d4e42e50f907f6bb01a982b14f56d WatchSource:0}: Error finding container 3b2a5fd4eaf57ba24251cff4f08bd6fef10d4e42e50f907f6bb01a982b14f56d: Status 404 returned error can't find the container with id 3b2a5fd4eaf57ba24251cff4f08bd6fef10d4e42e50f907f6bb01a982b14f56d Nov 24 11:20:19 crc kubenswrapper[4752]: I1124 11:20:19.234867 4752 generic.go:334] "Generic (PLEG): container finished" podID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerID="2c4be4fb1b3841d285fed9f25ea315f99ecb3ae6ea049745f63f79c9cf3bd7e7" exitCode=0 Nov 24 11:20:19 crc kubenswrapper[4752]: I1124 11:20:19.234934 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerDied","Data":"2c4be4fb1b3841d285fed9f25ea315f99ecb3ae6ea049745f63f79c9cf3bd7e7"} Nov 24 11:20:19 crc kubenswrapper[4752]: I1124 11:20:19.234959 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerStarted","Data":"3b2a5fd4eaf57ba24251cff4f08bd6fef10d4e42e50f907f6bb01a982b14f56d"} Nov 24 11:20:19 crc kubenswrapper[4752]: I1124 11:20:19.238954 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jjkc5" event={"ID":"1244a8ec-63d5-438f-8f4f-40796b68de59","Type":"ContainerStarted","Data":"134d680141af2781e44a41d4560244e52a81604dd78f49ad51ed9bed9d22ef95"} Nov 24 11:20:19 crc kubenswrapper[4752]: I1124 11:20:19.239194 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.247148 4752 generic.go:334] "Generic (PLEG): container finished" podID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerID="dfb8c52913935c3dcbb2d1e8d83575827dea310395646f14675e31af6c361742" exitCode=0 Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.247263 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerDied","Data":"dfb8c52913935c3dcbb2d1e8d83575827dea310395646f14675e31af6c361742"} Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.266659 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jjkc5" podStartSLOduration=7.3771296920000005 podStartE2EDuration="14.266636145s" podCreationTimestamp="2025-11-24 11:20:06 +0000 UTC" firstStartedPulling="2025-11-24 11:20:07.614509003 +0000 UTC m=+813.599329282" lastFinishedPulling="2025-11-24 11:20:14.504015446 +0000 UTC m=+820.488835735" observedRunningTime="2025-11-24 11:20:19.288943551 +0000 UTC m=+825.273763840" watchObservedRunningTime="2025-11-24 11:20:20.266636145 +0000 UTC m=+826.251456444" Nov 24 11:20:20 crc 
kubenswrapper[4752]: I1124 11:20:20.503325 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7"] Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.504705 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.508590 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.510480 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7"] Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.656707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.656776 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djswg\" (UniqueName: \"kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.656816 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.759199 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.759252 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djswg\" (UniqueName: \"kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.759286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") 
" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.759684 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.759836 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.785503 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djswg\" (UniqueName: \"kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:20 crc kubenswrapper[4752]: I1124 11:20:20.819947 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:21 crc kubenswrapper[4752]: I1124 11:20:21.255341 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerStarted","Data":"6dae0ff8645b8735b36d01f68d2c79379b2107522f3b12531497a652fa07bcef"} Nov 24 11:20:21 crc kubenswrapper[4752]: I1124 11:20:21.272889 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7"] Nov 24 11:20:21 crc kubenswrapper[4752]: I1124 11:20:21.274156 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pznxb" podStartSLOduration=1.858708002 podStartE2EDuration="3.274129091s" podCreationTimestamp="2025-11-24 11:20:18 +0000 UTC" firstStartedPulling="2025-11-24 11:20:19.236151875 +0000 UTC m=+825.220972164" lastFinishedPulling="2025-11-24 11:20:20.651572954 +0000 UTC m=+826.636393253" observedRunningTime="2025-11-24 11:20:21.270720962 +0000 UTC m=+827.255541251" watchObservedRunningTime="2025-11-24 11:20:21.274129091 +0000 UTC m=+827.258949380" Nov 24 11:20:22 crc kubenswrapper[4752]: I1124 11:20:22.269341 4752 generic.go:334] "Generic (PLEG): container finished" podID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerID="f0794ef9e27d600c8bf49baa625a38280f7e8d425f9fa906caac1c7d0e51ceab" exitCode=0 Nov 24 11:20:22 crc kubenswrapper[4752]: I1124 11:20:22.269553 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" event={"ID":"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5","Type":"ContainerDied","Data":"f0794ef9e27d600c8bf49baa625a38280f7e8d425f9fa906caac1c7d0e51ceab"} Nov 24 11:20:22 crc kubenswrapper[4752]: I1124 11:20:22.270453 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" event={"ID":"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5","Type":"ContainerStarted","Data":"67373fd7d06bfcbca03686092367eab4d68b3a18a3555234281847e35efd285e"} Nov 24 11:20:22 crc kubenswrapper[4752]: I1124 11:20:22.455666 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:22 crc kubenswrapper[4752]: I1124 11:20:22.489083 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:25 crc kubenswrapper[4752]: I1124 11:20:25.325308 4752 generic.go:334] "Generic (PLEG): container finished" podID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerID="b3f9b1b390e37b97957a2b5da7dfb1e735ed038c552bb0f52330115117eaa7c5" exitCode=0 Nov 24 11:20:25 crc kubenswrapper[4752]: I1124 11:20:25.325415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" event={"ID":"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5","Type":"ContainerDied","Data":"b3f9b1b390e37b97957a2b5da7dfb1e735ed038c552bb0f52330115117eaa7c5"} Nov 24 11:20:26 crc kubenswrapper[4752]: I1124 11:20:26.335419 4752 generic.go:334] "Generic (PLEG): container finished" podID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerID="0a52529b3d17e3f58b422ed8ea2234c11ab34cc2b49a0ba593327c64a7736450" exitCode=0 Nov 24 11:20:26 crc kubenswrapper[4752]: I1124 11:20:26.335505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" event={"ID":"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5","Type":"ContainerDied","Data":"0a52529b3d17e3f58b422ed8ea2234c11ab34cc2b49a0ba593327c64a7736450"} Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.458244 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jjkc5" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.491069 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-wwxm2" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.638435 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.756874 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle\") pod \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.756959 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djswg\" (UniqueName: \"kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg\") pod \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.757109 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util\") pod \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\" (UID: \"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5\") " Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.757795 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle" (OuterVolumeSpecName: "bundle") pod "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" (UID: "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.759345 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.764988 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg" (OuterVolumeSpecName: "kube-api-access-djswg") pod "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" (UID: "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5"). InnerVolumeSpecName "kube-api-access-djswg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.774515 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util" (OuterVolumeSpecName: "util") pod "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" (UID: "f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.861045 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djswg\" (UniqueName: \"kubernetes.io/projected/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-kube-api-access-djswg\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:27 crc kubenswrapper[4752]: I1124 11:20:27.861088 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5-util\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.350093 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" event={"ID":"f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5","Type":"ContainerDied","Data":"67373fd7d06bfcbca03686092367eab4d68b3a18a3555234281847e35efd285e"} Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.350146 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67373fd7d06bfcbca03686092367eab4d68b3a18a3555234281847e35efd285e" Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.350229 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7" Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.426447 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.426492 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:28 crc kubenswrapper[4752]: I1124 11:20:28.498089 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:29 crc kubenswrapper[4752]: I1124 11:20:29.428019 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.649076 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.649679 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pznxb" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="registry-server" containerID="cri-o://6dae0ff8645b8735b36d01f68d2c79379b2107522f3b12531497a652fa07bcef" gracePeriod=2 Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.832580 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5"] Nov 24 11:20:32 crc kubenswrapper[4752]: E1124 11:20:32.833278 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="pull" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.833303 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="pull" Nov 24 11:20:32 crc kubenswrapper[4752]: E1124 11:20:32.833324 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="util" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.833337 4752 
Nov 24 11:20:32 crc kubenswrapper[4752]: E1124 11:20:32.833369 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="extract" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.833381 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="extract" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.833551 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5" containerName="extract" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.834208 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.836645 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.839265 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.839373 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lpgxh" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.860635 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5"] Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.933008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a73ee6-9622-4421-9d60-687e2d253daa-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:32 crc kubenswrapper[4752]: I1124 11:20:32.933073 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqbg\" (UniqueName: \"kubernetes.io/projected/99a73ee6-9622-4421-9d60-687e2d253daa-kube-api-access-mjqbg\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.034925 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a73ee6-9622-4421-9d60-687e2d253daa-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.035015 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqbg\" (UniqueName: \"kubernetes.io/projected/99a73ee6-9622-4421-9d60-687e2d253daa-kube-api-access-mjqbg\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov
24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.036266 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99a73ee6-9622-4421-9d60-687e2d253daa-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.069374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqbg\" (UniqueName: \"kubernetes.io/projected/99a73ee6-9622-4421-9d60-687e2d253daa-kube-api-access-mjqbg\") pod \"cert-manager-operator-controller-manager-64cf6dff88-6pgm5\" (UID: \"99a73ee6-9622-4421-9d60-687e2d253daa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.152049 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.387142 4752 generic.go:334] "Generic (PLEG): container finished" podID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerID="6dae0ff8645b8735b36d01f68d2c79379b2107522f3b12531497a652fa07bcef" exitCode=0 Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.387347 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerDied","Data":"6dae0ff8645b8735b36d01f68d2c79379b2107522f3b12531497a652fa07bcef"} Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.679659 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5"] Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.869963 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.944624 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content\") pod \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.944701 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp4rc\" (UniqueName: \"kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc\") pod \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.944810 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities\") pod \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\" (UID: \"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf\") " Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.945537 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities" (OuterVolumeSpecName: "utilities") pod "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" (UID: "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.952763 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc" (OuterVolumeSpecName: "kube-api-access-qp4rc") pod "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" (UID: "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf"). InnerVolumeSpecName "kube-api-access-qp4rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:20:33 crc kubenswrapper[4752]: I1124 11:20:33.989514 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" (UID: "aef6b9d6-313d-4bde-9cdf-9a64b751a7cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.046996 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.047046 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp4rc\" (UniqueName: \"kubernetes.io/projected/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-kube-api-access-qp4rc\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.047067 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.397584 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pznxb" event={"ID":"aef6b9d6-313d-4bde-9cdf-9a64b751a7cf","Type":"ContainerDied","Data":"3b2a5fd4eaf57ba24251cff4f08bd6fef10d4e42e50f907f6bb01a982b14f56d"} Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.398181 4752 scope.go:117] "RemoveContainer" containerID="6dae0ff8645b8735b36d01f68d2c79379b2107522f3b12531497a652fa07bcef" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.397712 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pznxb" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.401099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" event={"ID":"99a73ee6-9622-4421-9d60-687e2d253daa","Type":"ContainerStarted","Data":"0ca040714eccc21d295f2e676ed4dedaa5694bdbfe9386dad0e536d83e9e4ce7"} Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.424072 4752 scope.go:117] "RemoveContainer" containerID="dfb8c52913935c3dcbb2d1e8d83575827dea310395646f14675e31af6c361742" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.441104 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.452429 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pznxb"] Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.462452 4752 scope.go:117] "RemoveContainer" containerID="2c4be4fb1b3841d285fed9f25ea315f99ecb3ae6ea049745f63f79c9cf3bd7e7" Nov 24 11:20:34 crc kubenswrapper[4752]: I1124 11:20:34.746387 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" path="/var/lib/kubelet/pods/aef6b9d6-313d-4bde-9cdf-9a64b751a7cf/volumes" Nov 24 11:20:40 crc kubenswrapper[4752]: I1124 11:20:40.446591 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" event={"ID":"99a73ee6-9622-4421-9d60-687e2d253daa","Type":"ContainerStarted","Data":"c8296daefae05370e6d8ed2b57a347d276ce702f66220f235f460196ebdad144"} Nov 24 11:20:40 crc kubenswrapper[4752]: I1124 11:20:40.484616 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-6pgm5" podStartSLOduration=2.10711927 podStartE2EDuration="8.484592202s" podCreationTimestamp="2025-11-24 11:20:32 +0000 UTC" firstStartedPulling="2025-11-24 11:20:33.702307167 +0000 UTC m=+839.687127456" lastFinishedPulling="2025-11-24 11:20:40.079780109 +0000 UTC m=+846.064600388" observedRunningTime="2025-11-24 11:20:40.479464964 +0000 UTC m=+846.464285353" watchObservedRunningTime="2025-11-24 11:20:40.484592202 +0000 UTC m=+846.469412501" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.259009 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rxsjd"] Nov 24 11:20:44 crc kubenswrapper[4752]: E1124 11:20:44.259572 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="registry-server" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.259588 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="registry-server" Nov 24 11:20:44 crc kubenswrapper[4752]: E1124 11:20:44.259599 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="extract-utilities" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.259607 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="extract-utilities" Nov 24 11:20:44 crc kubenswrapper[4752]: E1124 11:20:44.259630 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" 
containerName="extract-content" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.259639 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="extract-content" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.259798 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef6b9d6-313d-4bde-9cdf-9a64b751a7cf" containerName="registry-server" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.260244 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.262346 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.262567 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-z54xr" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.262579 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.278639 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rxsjd"] Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.383219 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.383435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftvl\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-kube-api-access-tftvl\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.485631 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftvl\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-kube-api-access-tftvl\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.485794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.505631 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftvl\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-kube-api-access-tftvl\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.507207 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3e7eab4-cc04-458f-adde-096e61b680f2-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rxsjd\" (UID: \"c3e7eab4-cc04-458f-adde-096e61b680f2\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.576766 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:44 crc kubenswrapper[4752]: I1124 11:20:44.996547 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rxsjd"] Nov 24 11:20:45 crc kubenswrapper[4752]: W1124 11:20:45.011063 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e7eab4_cc04_458f_adde_096e61b680f2.slice/crio-67d13addc029e747c550e2f93f28e05a762bcd6f8fb7b94add3665988c8a142a WatchSource:0}: Error finding container 67d13addc029e747c550e2f93f28e05a762bcd6f8fb7b94add3665988c8a142a: Status 404 returned error can't find the container with id 67d13addc029e747c550e2f93f28e05a762bcd6f8fb7b94add3665988c8a142a Nov 24 11:20:45 crc kubenswrapper[4752]: I1124 11:20:45.512183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" event={"ID":"c3e7eab4-cc04-458f-adde-096e61b680f2","Type":"ContainerStarted","Data":"67d13addc029e747c550e2f93f28e05a762bcd6f8fb7b94add3665988c8a142a"} Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.843028 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq"] Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.845308 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.850633 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-shfdk" Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.862170 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq"] Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.929965 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:46 crc kubenswrapper[4752]: I1124 11:20:46.930034 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9cm\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-kube-api-access-4s9cm\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.032548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9cm\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-kube-api-access-4s9cm\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.033655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.057544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9cm\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-kube-api-access-4s9cm\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.067137 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85b94b6e-97ea-4d23-a0af-42ae0ee795ae-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-g4qsq\" (UID: \"85b94b6e-97ea-4d23-a0af-42ae0ee795ae\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.201902 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.404136 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq"] Nov 24 11:20:47 crc kubenswrapper[4752]: W1124 11:20:47.408501 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b94b6e_97ea_4d23_a0af_42ae0ee795ae.slice/crio-80167ba73e64ac0529457801b7104a4df2091eb6206e4f7bd03a7a3b66576106 WatchSource:0}: Error finding container 80167ba73e64ac0529457801b7104a4df2091eb6206e4f7bd03a7a3b66576106: Status 404 returned error can't find the container with id 80167ba73e64ac0529457801b7104a4df2091eb6206e4f7bd03a7a3b66576106 Nov 24 11:20:47 crc kubenswrapper[4752]: I1124 11:20:47.569815 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" event={"ID":"85b94b6e-97ea-4d23-a0af-42ae0ee795ae","Type":"ContainerStarted","Data":"80167ba73e64ac0529457801b7104a4df2091eb6206e4f7bd03a7a3b66576106"} Nov 24 11:20:52 crc kubenswrapper[4752]: I1124 11:20:52.609849 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" event={"ID":"c3e7eab4-cc04-458f-adde-096e61b680f2","Type":"ContainerStarted","Data":"f57b3e69b4e239b82741f2ee97e1b8a2512ff38f548b4914f739fee96d29180e"} Nov 24 11:20:52 crc kubenswrapper[4752]: I1124 11:20:52.611309 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:20:52 crc kubenswrapper[4752]: I1124 11:20:52.611717 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" event={"ID":"85b94b6e-97ea-4d23-a0af-42ae0ee795ae","Type":"ContainerStarted","Data":"f16354f3731ca9016e9ada16e66873f6499dcf3cce76057350963ebb365a73c2"} Nov 24 11:20:52 crc kubenswrapper[4752]: I1124 11:20:52.655090 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" podStartSLOduration=1.399191915 podStartE2EDuration="8.65507268s" podCreationTimestamp="2025-11-24 11:20:44 +0000 UTC" firstStartedPulling="2025-11-24 11:20:45.012791535 +0000 UTC m=+850.997611834" lastFinishedPulling="2025-11-24 11:20:52.2686723 +0000 UTC m=+858.253492599" observedRunningTime="2025-11-24 11:20:52.650404705 +0000 UTC m=+858.635224994" watchObservedRunningTime="2025-11-24 11:20:52.65507268 +0000 UTC m=+858.639892969" Nov 24 11:20:52 crc kubenswrapper[4752]: I1124 11:20:52.676663 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-g4qsq" podStartSLOduration=1.84399647 podStartE2EDuration="6.676649344s" podCreationTimestamp="2025-11-24 11:20:46 +0000 UTC" firstStartedPulling="2025-11-24 11:20:47.410881159 +0000 UTC m=+853.395701468" lastFinishedPulling="2025-11-24 11:20:52.243534033 +0000 UTC m=+858.228354342" observedRunningTime="2025-11-24 11:20:52.675959184 +0000 UTC m=+858.660779473" watchObservedRunningTime="2025-11-24 11:20:52.676649344 +0000 UTC m=+858.661469633" Nov 24 11:20:59 crc kubenswrapper[4752]: I1124 11:20:59.580697 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-rxsjd" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.203791 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-86cb77c54b-7k726"] Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.205734 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.209181 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-b8wjs" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.217212 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-7k726"] Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.281511 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnm47\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-kube-api-access-xnm47\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.281688 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-bound-sa-token\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.383221 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnm47\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-kube-api-access-xnm47\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.383796 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-bound-sa-token\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.407165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnm47\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-kube-api-access-xnm47\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.407354 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d362a2ce-aaf7-464c-b569-1dccdbf6edcf-bound-sa-token\") pod \"cert-manager-86cb77c54b-7k726\" (UID: \"d362a2ce-aaf7-464c-b569-1dccdbf6edcf\") " pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.540294 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-7k726" Nov 24 11:21:03 crc kubenswrapper[4752]: I1124 11:21:03.962013 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-7k726"] Nov 24 11:21:04 crc kubenswrapper[4752]: I1124 11:21:04.687647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-7k726" event={"ID":"d362a2ce-aaf7-464c-b569-1dccdbf6edcf","Type":"ContainerStarted","Data":"00b20272a4dc20ad9d2e67fb9651e5707a737d10563cd51ee847f2fe4c2889d2"} Nov 24 11:21:06 crc kubenswrapper[4752]: I1124 11:21:06.701264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-7k726" event={"ID":"d362a2ce-aaf7-464c-b569-1dccdbf6edcf","Type":"ContainerStarted","Data":"5f1616cee1ab0bf85df69355b9eea71888875f5df45fa0192456de3f66c07e59"} Nov 24 11:21:07 crc kubenswrapper[4752]: I1124 11:21:07.741112 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-7k726" podStartSLOduration=4.741078631 podStartE2EDuration="4.741078631s" podCreationTimestamp="2025-11-24 11:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:21:07.732196114 +0000 UTC m=+873.717016413" watchObservedRunningTime="2025-11-24 11:21:07.741078631 +0000 UTC m=+873.725898960" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.669137 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.670884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.673884 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.690000 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2gcnk" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.690202 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.695475 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.718303 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wqh\" (UniqueName: \"kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh\") pod \"openstack-operator-index-gl5nw\" (UID: \"7232c8a0-d02b-447b-a249-18db1434cc7d\") " pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.819964 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5wqh\" (UniqueName: \"kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh\") pod \"openstack-operator-index-gl5nw\" (UID: \"7232c8a0-d02b-447b-a249-18db1434cc7d\") " pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.841481 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g5wqh\" (UniqueName: \"kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh\") pod \"openstack-operator-index-gl5nw\" (UID: \"7232c8a0-d02b-447b-a249-18db1434cc7d\") " pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:12 crc kubenswrapper[4752]: I1124 11:21:12.994168 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:13 crc kubenswrapper[4752]: I1124 11:21:13.457909 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:13 crc kubenswrapper[4752]: W1124 11:21:13.461188 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7232c8a0_d02b_447b_a249_18db1434cc7d.slice/crio-e29edc383476f211b6b57bd330eb2c2724f34e8c1b403a7469fe7a6a24141e19 WatchSource:0}: Error finding container e29edc383476f211b6b57bd330eb2c2724f34e8c1b403a7469fe7a6a24141e19: Status 404 returned error can't find the container with id e29edc383476f211b6b57bd330eb2c2724f34e8c1b403a7469fe7a6a24141e19 Nov 24 11:21:13 crc kubenswrapper[4752]: I1124 11:21:13.772843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gl5nw" event={"ID":"7232c8a0-d02b-447b-a249-18db1434cc7d","Type":"ContainerStarted","Data":"e29edc383476f211b6b57bd330eb2c2724f34e8c1b403a7469fe7a6a24141e19"} Nov 24 11:21:14 crc kubenswrapper[4752]: I1124 11:21:14.832637 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.254289 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p94tb"] Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.255960 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.266367 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p94tb"] Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.380501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xzmh\" (UniqueName: \"kubernetes.io/projected/3f65a59f-6b3f-45c1-9e63-76472e38cee1-kube-api-access-8xzmh\") pod \"openstack-operator-index-p94tb\" (UID: \"3f65a59f-6b3f-45c1-9e63-76472e38cee1\") " pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.472143 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.472202 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.482485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xzmh\" (UniqueName: \"kubernetes.io/projected/3f65a59f-6b3f-45c1-9e63-76472e38cee1-kube-api-access-8xzmh\") pod \"openstack-operator-index-p94tb\" (UID: \"3f65a59f-6b3f-45c1-9e63-76472e38cee1\") " pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.500306 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xzmh\" (UniqueName: \"kubernetes.io/projected/3f65a59f-6b3f-45c1-9e63-76472e38cee1-kube-api-access-8xzmh\") pod \"openstack-operator-index-p94tb\" (UID: \"3f65a59f-6b3f-45c1-9e63-76472e38cee1\") " pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:15 crc kubenswrapper[4752]: I1124 11:21:15.577359 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.016515 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p94tb"] Nov 24 11:21:16 crc kubenswrapper[4752]: W1124 11:21:16.023099 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f65a59f_6b3f_45c1_9e63_76472e38cee1.slice/crio-f09ac3efa49c163261036677ad06f06c88d941938847e1ffc32d2a03519e633b WatchSource:0}: Error finding container f09ac3efa49c163261036677ad06f06c88d941938847e1ffc32d2a03519e633b: Status 404 returned error can't find the container with id f09ac3efa49c163261036677ad06f06c88d941938847e1ffc32d2a03519e633b Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.793679 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gl5nw" event={"ID":"7232c8a0-d02b-447b-a249-18db1434cc7d","Type":"ContainerStarted","Data":"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8"} Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.793876 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-gl5nw" podUID="7232c8a0-d02b-447b-a249-18db1434cc7d" containerName="registry-server" containerID="cri-o://778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8" gracePeriod=2 Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.795688 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p94tb" event={"ID":"3f65a59f-6b3f-45c1-9e63-76472e38cee1","Type":"ContainerStarted","Data":"a4e8cf7380d640a70d81980aa6fbc988fc4a434ef24b32e655bdd763e37fa37f"} Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.795721 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p94tb" event={"ID":"3f65a59f-6b3f-45c1-9e63-76472e38cee1","Type":"ContainerStarted","Data":"f09ac3efa49c163261036677ad06f06c88d941938847e1ffc32d2a03519e633b"} Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.811993 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gl5nw" podStartSLOduration=2.623867771 podStartE2EDuration="4.811969715s" podCreationTimestamp="2025-11-24 11:21:12 +0000 UTC" firstStartedPulling="2025-11-24 11:21:13.464090164 +0000 UTC m=+879.448910463" lastFinishedPulling="2025-11-24 11:21:15.652192108 +0000 UTC m=+881.637012407" observedRunningTime="2025-11-24 11:21:16.808831765 +0000 UTC m=+882.793652064" watchObservedRunningTime="2025-11-24 11:21:16.811969715 +0000 UTC m=+882.796790024" Nov 24 11:21:16 crc kubenswrapper[4752]: I1124 11:21:16.836202 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p94tb" podStartSLOduration=1.792486322 podStartE2EDuration="1.836177465s" podCreationTimestamp="2025-11-24 11:21:15 +0000 UTC" firstStartedPulling="2025-11-24 11:21:16.027023794 +0000 UTC m=+882.011844093" lastFinishedPulling="2025-11-24 11:21:16.070714947 +0000 UTC m=+882.055535236" observedRunningTime="2025-11-24 11:21:16.829185433 +0000 UTC m=+882.814005722" watchObservedRunningTime="2025-11-24 11:21:16.836177465 +0000 UTC m=+882.820997754" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.158592 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.309084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5wqh\" (UniqueName: \"kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh\") pod \"7232c8a0-d02b-447b-a249-18db1434cc7d\" (UID: \"7232c8a0-d02b-447b-a249-18db1434cc7d\") " Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.317659 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh" (OuterVolumeSpecName: "kube-api-access-g5wqh") pod "7232c8a0-d02b-447b-a249-18db1434cc7d" (UID: "7232c8a0-d02b-447b-a249-18db1434cc7d"). InnerVolumeSpecName "kube-api-access-g5wqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.411565 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5wqh\" (UniqueName: \"kubernetes.io/projected/7232c8a0-d02b-447b-a249-18db1434cc7d-kube-api-access-g5wqh\") on node \"crc\" DevicePath \"\"" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.802520 4752 generic.go:334] "Generic (PLEG): container finished" podID="7232c8a0-d02b-447b-a249-18db1434cc7d" containerID="778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8" exitCode=0 Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.802579 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gl5nw" event={"ID":"7232c8a0-d02b-447b-a249-18db1434cc7d","Type":"ContainerDied","Data":"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8"} Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.802634 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gl5nw" event={"ID":"7232c8a0-d02b-447b-a249-18db1434cc7d","Type":"ContainerDied","Data":"e29edc383476f211b6b57bd330eb2c2724f34e8c1b403a7469fe7a6a24141e19"} Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.802656 4752 scope.go:117] "RemoveContainer" containerID="778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.803119 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gl5nw" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.815879 4752 scope.go:117] "RemoveContainer" containerID="778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8" Nov 24 11:21:17 crc kubenswrapper[4752]: E1124 11:21:17.816538 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8\": container with ID starting with 778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8 not found: ID does not exist" containerID="778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.816602 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8"} err="failed to get container status \"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8\": rpc error: code = NotFound desc = could not find container \"778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8\": container with ID starting with 778d555ed97f6a49b260c2072261f3009422528f70d757ed80321b02621b7bb8 not found: ID does not exist" Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.837267 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:17 crc kubenswrapper[4752]: I1124 11:21:17.842202 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-gl5nw"] Nov 24 11:21:18 crc kubenswrapper[4752]: I1124 11:21:18.738633 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7232c8a0-d02b-447b-a249-18db1434cc7d" path="/var/lib/kubelet/pods/7232c8a0-d02b-447b-a249-18db1434cc7d/volumes" Nov 24 11:21:25 crc kubenswrapper[4752]: I1124 11:21:25.578493 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:25 crc kubenswrapper[4752]: I1124 11:21:25.578928 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:25 crc kubenswrapper[4752]: I1124 11:21:25.610967 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:25 crc kubenswrapper[4752]: I1124 11:21:25.889889 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p94tb" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.072579 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp"] Nov 24 11:21:33 crc kubenswrapper[4752]: E1124 11:21:33.073577 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7232c8a0-d02b-447b-a249-18db1434cc7d" containerName="registry-server" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.073597 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7232c8a0-d02b-447b-a249-18db1434cc7d" containerName="registry-server" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.073811 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7232c8a0-d02b-447b-a249-18db1434cc7d" containerName="registry-server" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 
11:21:33.075124 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.079953 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lqq7x" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.102908 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp"] Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.140911 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6sbn\" (UniqueName: \"kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.141069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.141127 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.266082 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.266148 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6sbn\" (UniqueName: \"kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.266215 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.266984 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.266982 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.284564 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6sbn\" (UniqueName: \"kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn\") pod \"4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.388394 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.596238 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp"] Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.930356 4752 generic.go:334] "Generic (PLEG): container finished" podID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerID="175ea3bf611e3525bb1df2d973127d5e14d329f11d8a131ccd7e85a3219d5adb" exitCode=0 Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.930414 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" event={"ID":"2061713f-5dc5-441a-bc1a-9702e5959aea","Type":"ContainerDied","Data":"175ea3bf611e3525bb1df2d973127d5e14d329f11d8a131ccd7e85a3219d5adb"} Nov 24 11:21:33 crc kubenswrapper[4752]: I1124 11:21:33.930452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" event={"ID":"2061713f-5dc5-441a-bc1a-9702e5959aea","Type":"ContainerStarted","Data":"a2f64dda61b8aa4ebe5f99320a4aa0e46ad612fc80f7eb1d3d4715ba9f374e4d"} Nov 24 11:21:34 crc kubenswrapper[4752]: I1124 11:21:34.943185 4752 generic.go:334] "Generic (PLEG): container finished" podID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerID="d93545dda6bcd65ddcc94aaa618be857f90a5900736785b4c7b51eb58d6e7c1f" exitCode=0 Nov 24 11:21:34 crc kubenswrapper[4752]: I1124 11:21:34.943342 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" event={"ID":"2061713f-5dc5-441a-bc1a-9702e5959aea","Type":"ContainerDied","Data":"d93545dda6bcd65ddcc94aaa618be857f90a5900736785b4c7b51eb58d6e7c1f"} Nov 24 11:21:35 crc kubenswrapper[4752]: I1124 11:21:35.957936 4752 generic.go:334] "Generic (PLEG): container finished" podID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerID="7b68c2c45e08380da3aaa5a184f25c1a193f22d0dd8546042f30578dfa3cf138" exitCode=0 Nov 24 11:21:35 crc kubenswrapper[4752]: I1124 
11:21:35.958048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" event={"ID":"2061713f-5dc5-441a-bc1a-9702e5959aea","Type":"ContainerDied","Data":"7b68c2c45e08380da3aaa5a184f25c1a193f22d0dd8546042f30578dfa3cf138"} Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.253569 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.427279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util\") pod \"2061713f-5dc5-441a-bc1a-9702e5959aea\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.427563 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle\") pod \"2061713f-5dc5-441a-bc1a-9702e5959aea\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.427610 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6sbn\" (UniqueName: \"kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn\") pod \"2061713f-5dc5-441a-bc1a-9702e5959aea\" (UID: \"2061713f-5dc5-441a-bc1a-9702e5959aea\") " Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.428431 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle" (OuterVolumeSpecName: "bundle") pod "2061713f-5dc5-441a-bc1a-9702e5959aea" (UID: "2061713f-5dc5-441a-bc1a-9702e5959aea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.433048 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn" (OuterVolumeSpecName: "kube-api-access-d6sbn") pod "2061713f-5dc5-441a-bc1a-9702e5959aea" (UID: "2061713f-5dc5-441a-bc1a-9702e5959aea"). InnerVolumeSpecName "kube-api-access-d6sbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.446712 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util" (OuterVolumeSpecName: "util") pod "2061713f-5dc5-441a-bc1a-9702e5959aea" (UID: "2061713f-5dc5-441a-bc1a-9702e5959aea"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.529590 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.529643 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6sbn\" (UniqueName: \"kubernetes.io/projected/2061713f-5dc5-441a-bc1a-9702e5959aea-kube-api-access-d6sbn\") on node \"crc\" DevicePath \"\"" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.529658 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2061713f-5dc5-441a-bc1a-9702e5959aea-util\") on node \"crc\" DevicePath \"\"" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.979884 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" event={"ID":"2061713f-5dc5-441a-bc1a-9702e5959aea","Type":"ContainerDied","Data":"a2f64dda61b8aa4ebe5f99320a4aa0e46ad612fc80f7eb1d3d4715ba9f374e4d"} Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.979940 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp" Nov 24 11:21:37 crc kubenswrapper[4752]: I1124 11:21:37.979950 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f64dda61b8aa4ebe5f99320a4aa0e46ad612fc80f7eb1d3d4715ba9f374e4d" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.468334 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.468910 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.689034 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j"] Nov 24 11:21:45 crc kubenswrapper[4752]: E1124 11:21:45.689322 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="pull" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.689347 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="pull" Nov 24 11:21:45 crc kubenswrapper[4752]: E1124 11:21:45.689365 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="util" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.689373 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="util" Nov 24 11:21:45 crc kubenswrapper[4752]: E1124 11:21:45.689393 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="extract" Nov 24 
11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.689400 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="extract" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.689523 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2061713f-5dc5-441a-bc1a-9702e5959aea" containerName="extract" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.690324 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.692176 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-b6dkz" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.781110 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpf25\" (UniqueName: \"kubernetes.io/projected/2fc2a88f-2768-4cdc-a075-bdaa38c25853-kube-api-access-xpf25\") pod \"openstack-operator-controller-operator-689f78bdf7-xqn7j\" (UID: \"2fc2a88f-2768-4cdc-a075-bdaa38c25853\") " pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.798033 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j"] Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.882366 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpf25\" (UniqueName: \"kubernetes.io/projected/2fc2a88f-2768-4cdc-a075-bdaa38c25853-kube-api-access-xpf25\") pod \"openstack-operator-controller-operator-689f78bdf7-xqn7j\" (UID: \"2fc2a88f-2768-4cdc-a075-bdaa38c25853\") " pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:21:45 crc kubenswrapper[4752]: I1124 11:21:45.903683 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpf25\" (UniqueName: \"kubernetes.io/projected/2fc2a88f-2768-4cdc-a075-bdaa38c25853-kube-api-access-xpf25\") pod \"openstack-operator-controller-operator-689f78bdf7-xqn7j\" (UID: \"2fc2a88f-2768-4cdc-a075-bdaa38c25853\") " pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:21:46 crc kubenswrapper[4752]: I1124 11:21:46.048492 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:21:46 crc kubenswrapper[4752]: I1124 11:21:46.538184 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j"] Nov 24 11:21:47 crc kubenswrapper[4752]: I1124 11:21:47.039233 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" event={"ID":"2fc2a88f-2768-4cdc-a075-bdaa38c25853","Type":"ContainerStarted","Data":"29ae8d3f2e34a073e9db4338ed234f49d7e630460ce70021a3a44781e6ec0bd5"} Nov 24 11:21:52 crc kubenswrapper[4752]: I1124 11:21:52.336239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" event={"ID":"2fc2a88f-2768-4cdc-a075-bdaa38c25853","Type":"ContainerStarted","Data":"2a57a0c4d20ad073b776944edd74b84aac462065fb70610eee149d759a6149d8"} Nov 24 11:21:54 crc kubenswrapper[4752]: I1124 11:21:54.352888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" event={"ID":"2fc2a88f-2768-4cdc-a075-bdaa38c25853","Type":"ContainerStarted","Data":"ffc15ab241b7097c182fe5a40920be89e89eaca192d334c34d821742acc7984f"} Nov 24 11:21:54 crc kubenswrapper[4752]: I1124 11:21:54.353329 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:22:06 crc kubenswrapper[4752]: I1124 11:22:06.051176 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" Nov 24 11:22:06 crc kubenswrapper[4752]: I1124 11:22:06.094639 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-689f78bdf7-xqn7j" podStartSLOduration=13.574869018 podStartE2EDuration="21.094623326s" podCreationTimestamp="2025-11-24 11:21:45 +0000 UTC" firstStartedPulling="2025-11-24 11:21:46.556241307 +0000 UTC m=+912.541061596" lastFinishedPulling="2025-11-24 11:21:54.075995625 +0000 UTC m=+920.060815904" observedRunningTime="2025-11-24 11:21:54.392104676 +0000 UTC m=+920.376924955" watchObservedRunningTime="2025-11-24 11:22:06.094623326 +0000 UTC m=+932.079443615" Nov 24 11:22:15 crc kubenswrapper[4752]: I1124 11:22:15.469388 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:22:15 crc kubenswrapper[4752]: I1124 11:22:15.469899 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:22:15 crc kubenswrapper[4752]: I1124 11:22:15.469958 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:22:15 crc kubenswrapper[4752]: I1124 11:22:15.470618 4752 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:22:15 crc kubenswrapper[4752]: I1124 11:22:15.470672 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9" gracePeriod=600 Nov 24 11:22:16 crc kubenswrapper[4752]: I1124 11:22:16.492682 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9" exitCode=0 Nov 24 11:22:16 crc kubenswrapper[4752]: I1124 11:22:16.492774 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9"} Nov 24 11:22:16 crc kubenswrapper[4752]: I1124 11:22:16.493174 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c"} Nov 24 11:22:16 crc kubenswrapper[4752]: I1124 11:22:16.493198 4752 scope.go:117] "RemoveContainer" containerID="6e5c6804736df3aeb8f103a865f92241e4b53345e9d743bcf5a29c58bb51532b" Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.958450 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k"] Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.960164 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.962569 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zwp8z" Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.969681 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8"] Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.971383 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:22 crc kubenswrapper[4752]: I1124 11:22:22.972719 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-57w8x" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:22.980188 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:22.981295 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:22.983150 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zzx6l" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:22.985427 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.005366 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.013429 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.039861 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.041756 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xtswc" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.049218 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.054567 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-56mql"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.055575 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.061268 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tqrw8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.062320 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.083803 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-56mql"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.103505 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blqcp\" (UniqueName: \"kubernetes.io/projected/ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f-kube-api-access-blqcp\") pod \"cinder-operator-controller-manager-6498cbf48f-chvx8\" (UID: \"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.103559 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrpq2\" (UniqueName: \"kubernetes.io/projected/8d27c6dc-dd9d-4061-aa22-a334d0ffce1e-kube-api-access-qrpq2\") pod \"barbican-operator-controller-manager-75fb479bcc-zjh9k\" (UID: \"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.103611 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztsqr\" (UniqueName: \"kubernetes.io/projected/e4b457a1-f44d-4117-b4ac-96117293474b-kube-api-access-ztsqr\") pod \"designate-operator-controller-manager-767ccfd65f-pcppk\" (UID: \"e4b457a1-f44d-4117-b4ac-96117293474b\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.140655 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.141646 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.143288 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-q64qj" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.145642 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.146701 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.149211 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ws96w" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.155568 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.156510 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.159111 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-mx4bd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.159603 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.164308 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.168053 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.170003 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6b22z" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.170089 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.181333 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.182411 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.186183 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-92t7m" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.203202 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.204786 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsqr\" (UniqueName: \"kubernetes.io/projected/e4b457a1-f44d-4117-b4ac-96117293474b-kube-api-access-ztsqr\") pod \"designate-operator-controller-manager-767ccfd65f-pcppk\" (UID: \"e4b457a1-f44d-4117-b4ac-96117293474b\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.204841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfg2\" (UniqueName: \"kubernetes.io/projected/190fcfd5-d931-49b7-bca0-b0347fb39619-kube-api-access-pxfg2\") pod \"glance-operator-controller-manager-7969689c84-56mql\" (UID: \"190fcfd5-d931-49b7-bca0-b0347fb39619\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.204915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blqcp\" (UniqueName: \"kubernetes.io/projected/ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f-kube-api-access-blqcp\") pod \"cinder-operator-controller-manager-6498cbf48f-chvx8\" (UID: \"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.204944 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx95\" (UniqueName: \"kubernetes.io/projected/c60d4c47-fcde-4a44-ad68-f6113546b3e5-kube-api-access-gzx95\") pod \"heat-operator-controller-manager-56f54d6746-r8x8j\" (UID: \"c60d4c47-fcde-4a44-ad68-f6113546b3e5\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.204972 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpq2\" (UniqueName: \"kubernetes.io/projected/8d27c6dc-dd9d-4061-aa22-a334d0ffce1e-kube-api-access-qrpq2\") pod \"barbican-operator-controller-manager-75fb479bcc-zjh9k\" (UID: \"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.222151 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.227624 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.231112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrpq2\" (UniqueName: \"kubernetes.io/projected/8d27c6dc-dd9d-4061-aa22-a334d0ffce1e-kube-api-access-qrpq2\") pod 
\"barbican-operator-controller-manager-75fb479bcc-zjh9k\" (UID: \"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.243898 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.244894 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.244979 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.247158 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mmbcf" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.250009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztsqr\" (UniqueName: \"kubernetes.io/projected/e4b457a1-f44d-4117-b4ac-96117293474b-kube-api-access-ztsqr\") pod \"designate-operator-controller-manager-767ccfd65f-pcppk\" (UID: \"e4b457a1-f44d-4117-b4ac-96117293474b\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.256451 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blqcp\" (UniqueName: \"kubernetes.io/projected/ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f-kube-api-access-blqcp\") pod \"cinder-operator-controller-manager-6498cbf48f-chvx8\" (UID: \"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.258627 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.267441 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.269660 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.273871 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.274981 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.275205 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-z4ssb" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.283459 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-trkkw" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.287219 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.306900 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.307415 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.315819 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.320860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqfm\" (UniqueName: \"kubernetes.io/projected/dec80ec2-1e3d-413e-aed2-426ce66a601a-kube-api-access-pgqfm\") pod \"ironic-operator-controller-manager-99b499f4-d8l5p\" (UID: \"dec80ec2-1e3d-413e-aed2-426ce66a601a\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.320982 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fppn\" (UniqueName: \"kubernetes.io/projected/97ad1b28-46b3-4eb3-a00d-03b6e5b58575-kube-api-access-4fppn\") pod \"horizon-operator-controller-manager-598f69df5d-hrlfx\" (UID: \"97ad1b28-46b3-4eb3-a00d-03b6e5b58575\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfg2\" (UniqueName: \"kubernetes.io/projected/190fcfd5-d931-49b7-bca0-b0347fb39619-kube-api-access-pxfg2\") pod \"glance-operator-controller-manager-7969689c84-56mql\" (UID: \"190fcfd5-d931-49b7-bca0-b0347fb39619\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321211 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2s5c\" (UniqueName: \"kubernetes.io/projected/65cdca17-af51-44a1-b1ad-3411ff357a5f-kube-api-access-h2s5c\") pod \"manila-operator-controller-manager-58f887965d-bfqrz\" (UID: \"65cdca17-af51-44a1-b1ad-3411ff357a5f\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321285 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx95\" (UniqueName: \"kubernetes.io/projected/c60d4c47-fcde-4a44-ad68-f6113546b3e5-kube-api-access-gzx95\") pod \"heat-operator-controller-manager-56f54d6746-r8x8j\" (UID: \"c60d4c47-fcde-4a44-ad68-f6113546b3e5\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2sc2\" (UniqueName: \"kubernetes.io/projected/a480eff6-4e20-4afc-942a-075e40ef0699-kube-api-access-m2sc2\") pod \"keystone-operator-controller-manager-7454b96578-wh2h5\" (UID: \"a480eff6-4e20-4afc-942a-075e40ef0699\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 
11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321390 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5dd\" (UniqueName: \"kubernetes.io/projected/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-kube-api-access-zz5dd\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.321456 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.334150 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.335616 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.348689 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kntgn" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.358228 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfg2\" (UniqueName: \"kubernetes.io/projected/190fcfd5-d931-49b7-bca0-b0347fb39619-kube-api-access-pxfg2\") pod \"glance-operator-controller-manager-7969689c84-56mql\" (UID: \"190fcfd5-d931-49b7-bca0-b0347fb39619\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.365256 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx95\" (UniqueName: \"kubernetes.io/projected/c60d4c47-fcde-4a44-ad68-f6113546b3e5-kube-api-access-gzx95\") pod \"heat-operator-controller-manager-56f54d6746-r8x8j\" (UID: \"c60d4c47-fcde-4a44-ad68-f6113546b3e5\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.365326 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.384661 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.391841 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.393382 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.394868 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vgcg6" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.406331 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.407464 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.411598 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.411735 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ktblj" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.419493 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.421489 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn97v\" (UniqueName: \"kubernetes.io/projected/6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4-kube-api-access-bn97v\") pod \"ovn-operator-controller-manager-54fc5f65b7-vkkhd\" (UID: \"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422384 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmjv2\" (UniqueName: \"kubernetes.io/projected/87c73240-34a4-4ad6-b134-1bacc37d2eaa-kube-api-access-wmjv2\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422410 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7mj\" (UniqueName: \"kubernetes.io/projected/cbacf405-082e-46b7-94e2-e881df3184bf-kube-api-access-kt7mj\") pod \"nova-operator-controller-manager-cfbb9c588-kvzg4\" (UID: \"cbacf405-082e-46b7-94e2-e881df3184bf\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422431 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422487 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2s5c\" (UniqueName: \"kubernetes.io/projected/65cdca17-af51-44a1-b1ad-3411ff357a5f-kube-api-access-h2s5c\") pod \"manila-operator-controller-manager-58f887965d-bfqrz\" (UID: \"65cdca17-af51-44a1-b1ad-3411ff357a5f\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422533 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq82w\" (UniqueName: \"kubernetes.io/projected/0450318e-f006-4998-ad0a-6b21fe253ec8-kube-api-access-wq82w\") pod \"octavia-operator-controller-manager-54cfbf4c7d-sp84f\" (UID: \"0450318e-f006-4998-ad0a-6b21fe253ec8\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422555 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2sc2\" (UniqueName: \"kubernetes.io/projected/a480eff6-4e20-4afc-942a-075e40ef0699-kube-api-access-m2sc2\") pod \"keystone-operator-controller-manager-7454b96578-wh2h5\" (UID: \"a480eff6-4e20-4afc-942a-075e40ef0699\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5dd\" (UniqueName: \"kubernetes.io/projected/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-kube-api-access-zz5dd\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422647 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stcn4\" (UniqueName: \"kubernetes.io/projected/d92f3535-049c-451a-8f0f-eb863b9e6319-kube-api-access-stcn4\") pod \"mariadb-operator-controller-manager-54b5986bb8-prxnc\" (UID: \"d92f3535-049c-451a-8f0f-eb863b9e6319\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jj7\" (UniqueName: \"kubernetes.io/projected/7bb11c48-7c81-4e71-a258-1bd291051c79-kube-api-access-r9jj7\") pod \"neutron-operator-controller-manager-78bd47f458-d4fdd\" (UID: \"7bb11c48-7c81-4e71-a258-1bd291051c79\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422697 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422727 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqfm\" (UniqueName: \"kubernetes.io/projected/dec80ec2-1e3d-413e-aed2-426ce66a601a-kube-api-access-pgqfm\") pod \"ironic-operator-controller-manager-99b499f4-d8l5p\" (UID: \"dec80ec2-1e3d-413e-aed2-426ce66a601a\") " 
pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.422822 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fppn\" (UniqueName: \"kubernetes.io/projected/97ad1b28-46b3-4eb3-a00d-03b6e5b58575-kube-api-access-4fppn\") pod \"horizon-operator-controller-manager-598f69df5d-hrlfx\" (UID: \"97ad1b28-46b3-4eb3-a00d-03b6e5b58575\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.429805 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.439521 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.447058 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.449504 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.457375 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2s5c\" (UniqueName: \"kubernetes.io/projected/65cdca17-af51-44a1-b1ad-3411ff357a5f-kube-api-access-h2s5c\") pod \"manila-operator-controller-manager-58f887965d-bfqrz\" (UID: \"65cdca17-af51-44a1-b1ad-3411ff357a5f\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.457840 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.463522 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqfm\" (UniqueName: \"kubernetes.io/projected/dec80ec2-1e3d-413e-aed2-426ce66a601a-kube-api-access-pgqfm\") pod \"ironic-operator-controller-manager-99b499f4-d8l5p\" (UID: \"dec80ec2-1e3d-413e-aed2-426ce66a601a\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.464602 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bs9rz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.464832 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fppn\" (UniqueName: \"kubernetes.io/projected/97ad1b28-46b3-4eb3-a00d-03b6e5b58575-kube-api-access-4fppn\") pod \"horizon-operator-controller-manager-598f69df5d-hrlfx\" (UID: \"97ad1b28-46b3-4eb3-a00d-03b6e5b58575\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.465141 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.465138 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5dd\" (UniqueName: \"kubernetes.io/projected/e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3-kube-api-access-zz5dd\") pod \"infra-operator-controller-manager-6dd8864d7c-248dp\" (UID: \"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.468244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2sc2\" (UniqueName: \"kubernetes.io/projected/a480eff6-4e20-4afc-942a-075e40ef0699-kube-api-access-m2sc2\") pod \"keystone-operator-controller-manager-7454b96578-wh2h5\" (UID: \"a480eff6-4e20-4afc-942a-075e40ef0699\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.483801 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.504106 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.504830 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.519477 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.521223 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.522393 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.523671 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjv2\" (UniqueName: \"kubernetes.io/projected/87c73240-34a4-4ad6-b134-1bacc37d2eaa-kube-api-access-wmjv2\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.524058 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7mj\" (UniqueName: \"kubernetes.io/projected/cbacf405-082e-46b7-94e2-e881df3184bf-kube-api-access-kt7mj\") pod \"nova-operator-controller-manager-cfbb9c588-kvzg4\" (UID: \"cbacf405-082e-46b7-94e2-e881df3184bf\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.524097 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.524625 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq82w\" (UniqueName: \"kubernetes.io/projected/0450318e-f006-4998-ad0a-6b21fe253ec8-kube-api-access-wq82w\") pod \"octavia-operator-controller-manager-54cfbf4c7d-sp84f\" (UID: \"0450318e-f006-4998-ad0a-6b21fe253ec8\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.524665 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gjj\" (UniqueName: \"kubernetes.io/projected/c49282e6-3072-458d-8fdb-1e8282b3aa59-kube-api-access-n9gjj\") pod \"placement-operator-controller-manager-5b797b8dff-vdzg7\" (UID: \"c49282e6-3072-458d-8fdb-1e8282b3aa59\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.525029 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ftctn" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.525253 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stcn4\" (UniqueName: \"kubernetes.io/projected/d92f3535-049c-451a-8f0f-eb863b9e6319-kube-api-access-stcn4\") pod \"mariadb-operator-controller-manager-54b5986bb8-prxnc\" (UID: \"d92f3535-049c-451a-8f0f-eb863b9e6319\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:23 crc kubenswrapper[4752]: E1124 11:22:23.525458 4752 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 11:22:23 crc kubenswrapper[4752]: E1124 11:22:23.525518 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert 
podName:87c73240-34a4-4ad6-b134-1bacc37d2eaa nodeName:}" failed. No retries permitted until 2025-11-24 11:22:24.025502194 +0000 UTC m=+950.010322483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" (UID: "87c73240-34a4-4ad6-b134-1bacc37d2eaa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.525337 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jj7\" (UniqueName: \"kubernetes.io/projected/7bb11c48-7c81-4e71-a258-1bd291051c79-kube-api-access-r9jj7\") pod \"neutron-operator-controller-manager-78bd47f458-d4fdd\" (UID: \"7bb11c48-7c81-4e71-a258-1bd291051c79\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.527158 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.527921 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn97v\" (UniqueName: \"kubernetes.io/projected/6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4-kube-api-access-bn97v\") pod \"ovn-operator-controller-manager-54fc5f65b7-vkkhd\" (UID: \"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.565971 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjv2\" (UniqueName: \"kubernetes.io/projected/87c73240-34a4-4ad6-b134-1bacc37d2eaa-kube-api-access-wmjv2\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.571702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7mj\" (UniqueName: \"kubernetes.io/projected/cbacf405-082e-46b7-94e2-e881df3184bf-kube-api-access-kt7mj\") pod \"nova-operator-controller-manager-cfbb9c588-kvzg4\" (UID: \"cbacf405-082e-46b7-94e2-e881df3184bf\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.573529 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn97v\" (UniqueName: \"kubernetes.io/projected/6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4-kube-api-access-bn97v\") pod \"ovn-operator-controller-manager-54fc5f65b7-vkkhd\" (UID: \"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.576144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq82w\" (UniqueName: \"kubernetes.io/projected/0450318e-f006-4998-ad0a-6b21fe253ec8-kube-api-access-wq82w\") pod \"octavia-operator-controller-manager-54cfbf4c7d-sp84f\" (UID: \"0450318e-f006-4998-ad0a-6b21fe253ec8\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.576353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-stcn4\" (UniqueName: \"kubernetes.io/projected/d92f3535-049c-451a-8f0f-eb863b9e6319-kube-api-access-stcn4\") pod \"mariadb-operator-controller-manager-54b5986bb8-prxnc\" (UID: \"d92f3535-049c-451a-8f0f-eb863b9e6319\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.593336 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.605793 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.608134 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.609453 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jj7\" (UniqueName: \"kubernetes.io/projected/7bb11c48-7c81-4e71-a258-1bd291051c79-kube-api-access-r9jj7\") pod \"neutron-operator-controller-manager-78bd47f458-d4fdd\" (UID: \"7bb11c48-7c81-4e71-a258-1bd291051c79\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.610113 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.612505 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lpmlx" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.630412 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9gjj\" (UniqueName: \"kubernetes.io/projected/c49282e6-3072-458d-8fdb-1e8282b3aa59-kube-api-access-n9gjj\") pod \"placement-operator-controller-manager-5b797b8dff-vdzg7\" (UID: \"c49282e6-3072-458d-8fdb-1e8282b3aa59\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.630452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpvbn\" (UniqueName: \"kubernetes.io/projected/341d84f5-7ebf-48bb-a7d1-6c55d45d0c58-kube-api-access-mpvbn\") pod \"telemetry-operator-controller-manager-6d4bf84b58-p8b8t\" (UID: \"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.630507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/4a5a406f-6b3d-4919-9bde-e7af06fd38d4-kube-api-access-4h2zn\") pod \"swift-operator-controller-manager-d656998f4-pqj8q\" (UID: \"4a5a406f-6b3d-4919-9bde-e7af06fd38d4\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.633652 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.638960 4752 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.678205 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gjj\" (UniqueName: \"kubernetes.io/projected/c49282e6-3072-458d-8fdb-1e8282b3aa59-kube-api-access-n9gjj\") pod \"placement-operator-controller-manager-5b797b8dff-vdzg7\" (UID: \"c49282e6-3072-458d-8fdb-1e8282b3aa59\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.678554 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.679952 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.686023 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.696309 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qsjp9" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.710234 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.711488 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.718811 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.726038 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-x9svj" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.738879 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/4a5a406f-6b3d-4919-9bde-e7af06fd38d4-kube-api-access-4h2zn\") pod \"swift-operator-controller-manager-d656998f4-pqj8q\" (UID: \"4a5a406f-6b3d-4919-9bde-e7af06fd38d4\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.738922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7qq\" (UniqueName: \"kubernetes.io/projected/05751949-a258-4470-b4ad-4ad1ae9a3bc6-kube-api-access-hv7qq\") pod \"test-operator-controller-manager-b4c496f69-bbxj8\" (UID: \"05751949-a258-4470-b4ad-4ad1ae9a3bc6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.738943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9l7x\" (UniqueName: \"kubernetes.io/projected/eb9eb6b3-a33b-4f64-92a4-f2415648f6c6-kube-api-access-j9l7x\") pod \"watcher-operator-controller-manager-8c6448b9f-llhmp\" (UID: \"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6\") " 
pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.739026 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpvbn\" (UniqueName: \"kubernetes.io/projected/341d84f5-7ebf-48bb-a7d1-6c55d45d0c58-kube-api-access-mpvbn\") pod \"telemetry-operator-controller-manager-6d4bf84b58-p8b8t\" (UID: \"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.758286 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpvbn\" (UniqueName: \"kubernetes.io/projected/341d84f5-7ebf-48bb-a7d1-6c55d45d0c58-kube-api-access-mpvbn\") pod \"telemetry-operator-controller-manager-6d4bf84b58-p8b8t\" (UID: \"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.759140 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.761498 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/4a5a406f-6b3d-4919-9bde-e7af06fd38d4-kube-api-access-4h2zn\") pod \"swift-operator-controller-manager-d656998f4-pqj8q\" (UID: \"4a5a406f-6b3d-4919-9bde-e7af06fd38d4\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.767425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.798693 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.818304 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.819343 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.823674 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nfv2s" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.824622 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.829819 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.836043 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.853041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnpmw\" (UniqueName: \"kubernetes.io/projected/c1df47a5-8725-40e7-bc90-2d77c49dba4a-kube-api-access-fnpmw\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.853125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7qq\" (UniqueName: \"kubernetes.io/projected/05751949-a258-4470-b4ad-4ad1ae9a3bc6-kube-api-access-hv7qq\") pod \"test-operator-controller-manager-b4c496f69-bbxj8\" (UID: \"05751949-a258-4470-b4ad-4ad1ae9a3bc6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.853146 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9l7x\" (UniqueName: \"kubernetes.io/projected/eb9eb6b3-a33b-4f64-92a4-f2415648f6c6-kube-api-access-j9l7x\") pod \"watcher-operator-controller-manager-8c6448b9f-llhmp\" (UID: \"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.853182 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.866297 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.868796 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.892496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9l7x\" (UniqueName: \"kubernetes.io/projected/eb9eb6b3-a33b-4f64-92a4-f2415648f6c6-kube-api-access-j9l7x\") pod \"watcher-operator-controller-manager-8c6448b9f-llhmp\" (UID: \"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.906887 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7qq\" (UniqueName: \"kubernetes.io/projected/05751949-a258-4470-b4ad-4ad1ae9a3bc6-kube-api-access-hv7qq\") pod \"test-operator-controller-manager-b4c496f69-bbxj8\" (UID: \"05751949-a258-4470-b4ad-4ad1ae9a3bc6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.939424 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.940426 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.949576 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg"] Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.952498 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hqfcv" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.956884 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.956960 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnpmw\" (UniqueName: \"kubernetes.io/projected/c1df47a5-8725-40e7-bc90-2d77c49dba4a-kube-api-access-fnpmw\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:23 crc kubenswrapper[4752]: E1124 11:22:23.957375 4752 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 24 11:22:23 crc kubenswrapper[4752]: E1124 11:22:23.957419 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert podName:c1df47a5-8725-40e7-bc90-2d77c49dba4a nodeName:}" failed. No retries permitted until 2025-11-24 11:22:24.457405084 +0000 UTC m=+950.442225373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert") pod "openstack-operator-controller-manager-585c6fcdf4-p7vw7" (UID: "c1df47a5-8725-40e7-bc90-2d77c49dba4a") : secret "webhook-server-cert" not found Nov 24 11:22:23 crc kubenswrapper[4752]: I1124 11:22:23.985577 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnpmw\" (UniqueName: \"kubernetes.io/projected/c1df47a5-8725-40e7-bc90-2d77c49dba4a-kube-api-access-fnpmw\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.043217 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.058649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gggd\" (UniqueName: \"kubernetes.io/projected/0198b944-3886-44c4-85c1-8786136c4f2a-kube-api-access-9gggd\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg\" (UID: \"0198b944-3886-44c4-85c1-8786136c4f2a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.058731 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.058884 4752 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.058929 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert podName:87c73240-34a4-4ad6-b134-1bacc37d2eaa nodeName:}" failed. No retries permitted until 2025-11-24 11:22:25.058916209 +0000 UTC m=+951.043736498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" (UID: "87c73240-34a4-4ad6-b134-1bacc37d2eaa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.149671 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.160517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gggd\" (UniqueName: \"kubernetes.io/projected/0198b944-3886-44c4-85c1-8786136c4f2a-kube-api-access-9gggd\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg\" (UID: \"0198b944-3886-44c4-85c1-8786136c4f2a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.167230 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.177715 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gggd\" (UniqueName: \"kubernetes.io/projected/0198b944-3886-44c4-85c1-8786136c4f2a-kube-api-access-9gggd\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg\" (UID: \"0198b944-3886-44c4-85c1-8786136c4f2a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.194692 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.219992 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.276413 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.466889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.467014 4752 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.467072 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert podName:c1df47a5-8725-40e7-bc90-2d77c49dba4a nodeName:}" failed. No retries permitted until 2025-11-24 11:22:25.467055681 +0000 UTC m=+951.451875970 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert") pod "openstack-operator-controller-manager-585c6fcdf4-p7vw7" (UID: "c1df47a5-8725-40e7-bc90-2d77c49dba4a") : secret "webhook-server-cert" not found Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.586499 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" event={"ID":"e4b457a1-f44d-4117-b4ac-96117293474b","Type":"ContainerStarted","Data":"08de3d9a7f858d21e5fc0688e64395bdf485c967997b2678f59ba44a4cc5d81b"} Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.589990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" event={"ID":"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e","Type":"ContainerStarted","Data":"a9df19a6dca84be5a86f3df30fad4cda193ea5898de7a3bfcc791ac10f4e23e8"} Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.594952 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" event={"ID":"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f","Type":"ContainerStarted","Data":"afa56d7fc224d6ff511627213408eb55dccd18a944b443ef9a236d27eebd470f"} Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.666127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.703566 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.858372 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.878055 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-56mql"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.889922 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.898436 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.936292 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.940918 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz"] Nov 24 11:22:24 crc kubenswrapper[4752]: W1124 11:22:24.953054 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65cdca17_af51_44a1_b1ad_3411ff357a5f.slice/crio-5be6503be5d8762cca1b23571162d37705f4221dce42e1f92f3f4d3b91e03462 WatchSource:0}: Error finding container 5be6503be5d8762cca1b23571162d37705f4221dce42e1f92f3f4d3b91e03462: Status 404 returned error can't find the container with id 5be6503be5d8762cca1b23571162d37705f4221dce42e1f92f3f4d3b91e03462 Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.958608 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.962886 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.966232 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t"] Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.966673 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9jj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-78bd47f458-d4fdd_openstack-operators(7bb11c48-7c81-4e71-a258-1bd291051c79): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.966777 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpvbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d4bf84b58-p8b8t_openstack-operators(341d84f5-7ebf-48bb-a7d1-6c55d45d0c58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:24 crc kubenswrapper[4752]: E1124 11:22:24.968310 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n9gjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b797b8dff-vdzg7_openstack-operators(c49282e6-3072-458d-8fdb-1e8282b3aa59): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.977852 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.986796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7"] Nov 24 11:22:24 crc kubenswrapper[4752]: I1124 11:22:24.989596 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd"] Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.082899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.090169 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87c73240-34a4-4ad6-b134-1bacc37d2eaa-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-csjb7\" (UID: \"87c73240-34a4-4ad6-b134-1bacc37d2eaa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.122418 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp"] Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.127730 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q"] Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.136607 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8"]
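Note: every "ErrImagePull: pull QPS exceeded" in this burst is kubelet's own client-side throttle rejecting the pull before it ever reaches the registry. With roughly twenty operator deployments scheduled in the same second, the pull requests exhaust the token bucket governed by the KubeletConfiguration fields registryPullQPS (default 5) and registryBurst (default 10); the surplus pulls fail immediately and the affected pods drop into ImagePullBackOff until tokens refill, and the same pods' pulls complete about ten seconds later in the log. A minimal Python sketch of that token-bucket admission check follows; the class is illustrative, and only the two field names and defaults are kubelet's.

```python
import time

class TokenBucket:
    """Token-bucket limiter with the same shape as kubelet's image-pull
    throttle: tokens refill at `qps` per second up to `burst`, and a pull
    that finds no token is rejected immediately ("pull QPS exceeded")."""

    def __init__(self, qps=5.0, burst=10):  # kubelet defaults: registryPullQPS=5, registryBurst=10
        self.qps, self.burst = qps, float(burst)
        self.tokens = self.burst
        self.last = time.monotonic()

    def try_accept(self):
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.qps)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket()
verdicts = [bucket.try_accept() for _ in range(20)]  # 20 near-simultaneous pulls
print(verdicts.count(True), "admitted;", verdicts.count(False), 'rejected with "pull QPS exceeded"')
```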
Nov 24 11:22:25 crc kubenswrapper[4752]: W1124 11:22:25.138073 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05751949_a258_4470_b4ad_4ad1ae9a3bc6.slice/crio-de372c6b445ada9e52960233ea4ba650a0b07f28d098804e9d3e38af4e4eeb4a WatchSource:0}: Error finding container de372c6b445ada9e52960233ea4ba650a0b07f28d098804e9d3e38af4e4eeb4a: Status 404 returned error can't find the container with id de372c6b445ada9e52960233ea4ba650a0b07f28d098804e9d3e38af4e4eeb4a Nov 24 11:22:25 crc kubenswrapper[4752]: W1124 11:22:25.138928 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb9eb6b3_a33b_4f64_92a4_f2415648f6c6.slice/crio-fc3e96f061b20311b49547bac55259fb48c3e659d8620e3f2049305e65fe98b7 WatchSource:0}: Error finding container fc3e96f061b20311b49547bac55259fb48c3e659d8620e3f2049305e65fe98b7: Status 404 returned error can't find the container with id fc3e96f061b20311b49547bac55259fb48c3e659d8620e3f2049305e65fe98b7 Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.141457 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" podUID="341d84f5-7ebf-48bb-a7d1-6c55d45d0c58" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.141654 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hv7qq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-b4c496f69-bbxj8_openstack-operators(05751949-a258-4470-b4ad-4ad1ae9a3bc6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.142394 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9l7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-8c6448b9f-llhmp_openstack-operators(eb9eb6b3-a33b-4f64-92a4-f2415648f6c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.150710 4752 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.156812 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg"] Nov 24 11:22:25 crc kubenswrapper[4752]: W1124 11:22:25.165736 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0198b944_3886_44c4_85c1_8786136c4f2a.slice/crio-4468505378b1aa59f5fe41989c5c4c7f792a67f60e34246b62a949ffc3f4e952 WatchSource:0}: Error finding container 4468505378b1aa59f5fe41989c5c4c7f792a67f60e34246b62a949ffc3f4e952: Status 404 returned error can't find the container with id 4468505378b1aa59f5fe41989c5c4c7f792a67f60e34246b62a949ffc3f4e952 Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.167244 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" podUID="c49282e6-3072-458d-8fdb-1e8282b3aa59" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.171461 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9gggd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg_openstack-operators(0198b944-3886-44c4-85c1-8786136c4f2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.173332 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" podUID="0198b944-3886-44c4-85c1-8786136c4f2a" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.181702 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" podUID="7bb11c48-7c81-4e71-a258-1bd291051c79" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.412147 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" podUID="05751949-a258-4470-b4ad-4ad1ae9a3bc6" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.442440 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" podUID="eb9eb6b3-a33b-4f64-92a4-f2415648f6c6" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.489560 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.511042 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1df47a5-8725-40e7-bc90-2d77c49dba4a-cert\") pod \"openstack-operator-controller-manager-585c6fcdf4-p7vw7\" (UID: \"c1df47a5-8725-40e7-bc90-2d77c49dba4a\") " pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.609036 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7"] Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.610391 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" event={"ID":"a480eff6-4e20-4afc-942a-075e40ef0699","Type":"ContainerStarted","Data":"2c538f8a13fe351038aef5be64005afd65b742caacbe81b2d769a97e73d03675"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.613048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" event={"ID":"c49282e6-3072-458d-8fdb-1e8282b3aa59","Type":"ContainerStarted","Data":"2d82d85fead4223778c599e6d10af976bcdab678ec098b4f1c0c88bedd6047fb"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.613103 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" event={"ID":"c49282e6-3072-458d-8fdb-1e8282b3aa59","Type":"ContainerStarted","Data":"2be625360b2a3484b3862a9cb24a5a7da3587aa8f511063fb66d33fdc3a1ced1"} Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.617951 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" podUID="c49282e6-3072-458d-8fdb-1e8282b3aa59" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.619143 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" event={"ID":"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6","Type":"ContainerStarted","Data":"8a1b2d9822b25ace40347b2e94350c7b7422ad8ae4d47af4f31ec06159f3ead8"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.619210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" event={"ID":"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6","Type":"ContainerStarted","Data":"fc3e96f061b20311b49547bac55259fb48c3e659d8620e3f2049305e65fe98b7"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.628231 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" event={"ID":"d92f3535-049c-451a-8f0f-eb863b9e6319","Type":"ContainerStarted","Data":"03967db87b98122b901454277cb51bd0d10943683b1c9b15554a7473045d8481"} Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.639379 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" podUID="eb9eb6b3-a33b-4f64-92a4-f2415648f6c6" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.644176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" event={"ID":"cbacf405-082e-46b7-94e2-e881df3184bf","Type":"ContainerStarted","Data":"4dbed4ca7a04645dbafc2e3ea41ede29d456dd26107992d3ea853b5f98a32c02"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.688805 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" event={"ID":"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58","Type":"ContainerStarted","Data":"0520c90b310d31dd8dfe57396d72b4a609b67e7f45f5462f6a89713f0d115b16"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.688858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" event={"ID":"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58","Type":"ContainerStarted","Data":"022a3c51cbf6e976f2dd1924557f5fa78973403d1669c90f71d4ca21b40d0015"} Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.691600 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" podUID="341d84f5-7ebf-48bb-a7d1-6c55d45d0c58" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.695651 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" 
event={"ID":"dec80ec2-1e3d-413e-aed2-426ce66a601a","Type":"ContainerStarted","Data":"d3f0154ee2e4a4ae176852a0ce96e471b12fcf42a3d797ccb930050ba3ef8f15"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.703969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" event={"ID":"4a5a406f-6b3d-4919-9bde-e7af06fd38d4","Type":"ContainerStarted","Data":"002d7d46fa799249540c5963c2d254e8b10489f9e26d41719997a0e54db745d3"} Nov 24 11:22:25 crc kubenswrapper[4752]: W1124 11:22:25.708241 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c73240_34a4_4ad6_b134_1bacc37d2eaa.slice/crio-d15a5c06167673ce8cbe9105543c40ad2452674f19815ff2454f0fc6ac11f9ce WatchSource:0}: Error finding container d15a5c06167673ce8cbe9105543c40ad2452674f19815ff2454f0fc6ac11f9ce: Status 404 returned error can't find the container with id d15a5c06167673ce8cbe9105543c40ad2452674f19815ff2454f0fc6ac11f9ce Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.710317 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" event={"ID":"7bb11c48-7c81-4e71-a258-1bd291051c79","Type":"ContainerStarted","Data":"191e6eabd1e1048dec9afb8b065b4927b56ad738a9e428fdf1b199d04434ea87"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.710362 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" event={"ID":"7bb11c48-7c81-4e71-a258-1bd291051c79","Type":"ContainerStarted","Data":"ce4b7a0a1a8892c952084829f1033448a577cf585b1dd6a12e48c471edf45925"} Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.712311 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" podUID="7bb11c48-7c81-4e71-a258-1bd291051c79" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.713355 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" event={"ID":"65cdca17-af51-44a1-b1ad-3411ff357a5f","Type":"ContainerStarted","Data":"5be6503be5d8762cca1b23571162d37705f4221dce42e1f92f3f4d3b91e03462"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.716914 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.722706 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" event={"ID":"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3","Type":"ContainerStarted","Data":"48c8048e6b41f30ccd691181c19f02bb9c91a4e88d511699e71b688d22fa5a6d"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.760502 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" event={"ID":"05751949-a258-4470-b4ad-4ad1ae9a3bc6","Type":"ContainerStarted","Data":"7ae6dc0978ecd9bd7ca0d09c4c519fac8e87310c8e4242e817c55204a983ffd4"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.760556 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" event={"ID":"05751949-a258-4470-b4ad-4ad1ae9a3bc6","Type":"ContainerStarted","Data":"de372c6b445ada9e52960233ea4ba650a0b07f28d098804e9d3e38af4e4eeb4a"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.787132 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" event={"ID":"0198b944-3886-44c4-85c1-8786136c4f2a","Type":"ContainerStarted","Data":"4468505378b1aa59f5fe41989c5c4c7f792a67f60e34246b62a949ffc3f4e952"} Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.795466 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" podUID="05751949-a258-4470-b4ad-4ad1ae9a3bc6" Nov 24 11:22:25 crc kubenswrapper[4752]: E1124 11:22:25.799876 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" podUID="0198b944-3886-44c4-85c1-8786136c4f2a" Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.802310 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" event={"ID":"c60d4c47-fcde-4a44-ad68-f6113546b3e5","Type":"ContainerStarted","Data":"57439ff2c482417fdd0f72c2df420f86790987f26a14e6308ad7cd0e3b3df2ab"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.805030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" event={"ID":"0450318e-f006-4998-ad0a-6b21fe253ec8","Type":"ContainerStarted","Data":"3a767a8217196ea910c46546a2b4f4416ae6e690281b5d4dbbc1fd7e7ab32d7c"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.809350 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" event={"ID":"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4","Type":"ContainerStarted","Data":"bdcd18088171e37914646eccee46a9d22930fcf18100f9679bb4f1326b340f34"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.813189 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" event={"ID":"190fcfd5-d931-49b7-bca0-b0347fb39619","Type":"ContainerStarted","Data":"ca8292b4b5f7fbaeb0e26ce340a31298cfd48b04a6ba38aac47961bd7f5a48d8"} Nov 24 11:22:25 crc kubenswrapper[4752]: I1124 11:22:25.816112 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" event={"ID":"97ad1b28-46b3-4eb3-a00d-03b6e5b58575","Type":"ContainerStarted","Data":"111e116126cf2f26244774e56f17a697302f244922c6548ba9d7ec25b897482a"} Nov 24 11:22:26 crc kubenswrapper[4752]: I1124 11:22:26.469411 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7"] Nov 24 11:22:26 crc kubenswrapper[4752]: W1124 11:22:26.484635 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1df47a5_8725_40e7_bc90_2d77c49dba4a.slice/crio-8684abe7dcdc2580fa6ce6d5ace5ebea6616f424700415a03479c339e6412d4a WatchSource:0}: Error finding container 8684abe7dcdc2580fa6ce6d5ace5ebea6616f424700415a03479c339e6412d4a: Status 404 returned error can't find the container with id 8684abe7dcdc2580fa6ce6d5ace5ebea6616f424700415a03479c339e6412d4a Nov 24 11:22:26 crc kubenswrapper[4752]: I1124 11:22:26.836985 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" event={"ID":"c1df47a5-8725-40e7-bc90-2d77c49dba4a","Type":"ContainerStarted","Data":"8fb0b4009447f4f6a89d06b53e9a5cb266cc17cb1068c6931081ed2fa0ff3373"} Nov 24 11:22:26 crc kubenswrapper[4752]: I1124 11:22:26.837295 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" event={"ID":"c1df47a5-8725-40e7-bc90-2d77c49dba4a","Type":"ContainerStarted","Data":"8684abe7dcdc2580fa6ce6d5ace5ebea6616f424700415a03479c339e6412d4a"} Nov 24 11:22:26 crc kubenswrapper[4752]: I1124 11:22:26.851303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" event={"ID":"87c73240-34a4-4ad6-b134-1bacc37d2eaa","Type":"ContainerStarted","Data":"d15a5c06167673ce8cbe9105543c40ad2452674f19815ff2454f0fc6ac11f9ce"} Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.854398 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" podUID="eb9eb6b3-a33b-4f64-92a4-f2415648f6c6" Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.855251 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" podUID="7bb11c48-7c81-4e71-a258-1bd291051c79" Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.855399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" podUID="c49282e6-3072-458d-8fdb-1e8282b3aa59" Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.855852 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" podUID="0198b944-3886-44c4-85c1-8786136c4f2a" Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.857901 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" podUID="341d84f5-7ebf-48bb-a7d1-6c55d45d0c58" Nov 24 11:22:26 crc kubenswrapper[4752]: E1124 11:22:26.857968 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" podUID="05751949-a258-4470-b4ad-4ad1ae9a3bc6" Nov 24 11:22:27 crc kubenswrapper[4752]: I1124 11:22:27.873176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" event={"ID":"c1df47a5-8725-40e7-bc90-2d77c49dba4a","Type":"ContainerStarted","Data":"ee4ced18bc7dbc02a6d9bcca44db066358cc2d1641e5fe4cbfb74aadb49c05e2"} Nov 24 11:22:27 crc kubenswrapper[4752]: I1124 11:22:27.874114 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:34 crc kubenswrapper[4752]: I1124 11:22:34.801593 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" podStartSLOduration=11.801576024 podStartE2EDuration="11.801576024s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:22:27.905150701 +0000 UTC m=+953.889970980" watchObservedRunningTime="2025-11-24 11:22:34.801576024 +0000 UTC m=+960.786396313" Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.008098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" event={"ID":"c60d4c47-fcde-4a44-ad68-f6113546b3e5","Type":"ContainerStarted","Data":"32bcb35bec05e45568303cdd987f8aaf84cd5e0f24b650fda505d60d9625d9e4"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.027044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" 
event={"ID":"4a5a406f-6b3d-4919-9bde-e7af06fd38d4","Type":"ContainerStarted","Data":"26c67e739b9472de2d5661001fa7ec6359e7d4a1ae599b9b30a8ad92ac2ffc0b"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.063124 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" event={"ID":"dec80ec2-1e3d-413e-aed2-426ce66a601a","Type":"ContainerStarted","Data":"4d26170d285c3f1408a36e7f844b6a17b5c9b9833711333ba23f19a96f0413cf"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.101602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" event={"ID":"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f","Type":"ContainerStarted","Data":"3cae36d831feff91c09ed5a6471b065bbf5284494d8631ddc3e46e0d4c20033f"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.124215 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" event={"ID":"97ad1b28-46b3-4eb3-a00d-03b6e5b58575","Type":"ContainerStarted","Data":"e163fe6f719c6ee33ff4dfc68982e83b60c14c5ad9e11b9cf4c24239d22f2215"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.183620 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" event={"ID":"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e","Type":"ContainerStarted","Data":"b176bc607031dc67e7ff1cc2b3360e8c532b4a65c6163999007e98c5be27d590"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.201584 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" event={"ID":"0450318e-f006-4998-ad0a-6b21fe253ec8","Type":"ContainerStarted","Data":"eab433c1519d69cef7cd7cce759b0d93020b6ebfca2e91f578fc2ab4fa57544d"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.208692 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" event={"ID":"65cdca17-af51-44a1-b1ad-3411ff357a5f","Type":"ContainerStarted","Data":"9d8b33b77b8dd1337b2f19eadfbedcfb9bfd702151d7f7a4596e053b570f7a12"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.260246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" event={"ID":"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4","Type":"ContainerStarted","Data":"d41d5f8c4b9612f004b62fd8ab4c9b8e131f3944ee4383aa1ac9b3807b8faca8"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.269284 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" event={"ID":"d92f3535-049c-451a-8f0f-eb863b9e6319","Type":"ContainerStarted","Data":"9f10cac7b99b6c16b258a4177f8567231ca9b6c90181d12f9039a160debe901e"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.278905 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" event={"ID":"cbacf405-082e-46b7-94e2-e881df3184bf","Type":"ContainerStarted","Data":"d94afbc6d1d781b85fc3ddf0f6aff1e6d1f8bdee637c0d223850f6d3ec7c34c1"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.317856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" 
event={"ID":"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3","Type":"ContainerStarted","Data":"e5ac2f2cfea75b5bda605e468fbc1ff5476ed11e28bd91b34fc7a6c86a694960"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.342868 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" event={"ID":"190fcfd5-d931-49b7-bca0-b0347fb39619","Type":"ContainerStarted","Data":"6c3fa85dffc78afe92382b1d66c3b3f1980e59594b5a8a6bb62cf76978d9b90c"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.344187 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" event={"ID":"a480eff6-4e20-4afc-942a-075e40ef0699","Type":"ContainerStarted","Data":"734a5ad649e065617893fe9ca66a6ac20fa8a99fa89b36c94bf6d42fa6a49100"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.379037 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" event={"ID":"e4b457a1-f44d-4117-b4ac-96117293474b","Type":"ContainerStarted","Data":"15100ac1351ddd2092c386f9f6b4a6da665c7187af37abd9d9a0cf18072c7685"} Nov 24 11:22:35 crc kubenswrapper[4752]: I1124 11:22:35.723326 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-585c6fcdf4-p7vw7" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.387446 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" event={"ID":"0450318e-f006-4998-ad0a-6b21fe253ec8","Type":"ContainerStarted","Data":"02d671c6db5508be913ce3bc72570da6ad5e64b1429f2837f7a781d50cc0a05e"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.387535 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.389894 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" event={"ID":"a480eff6-4e20-4afc-942a-075e40ef0699","Type":"ContainerStarted","Data":"7582e02994e9c2958055d830792b45fdf22b63fc2b05391847cbe9968ba6dfff"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.391680 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" event={"ID":"dec80ec2-1e3d-413e-aed2-426ce66a601a","Type":"ContainerStarted","Data":"bb0115ecddfea52bc032f5785a7d91a0c197350600395156367dc7e2296dc420"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.391790 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.393303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" event={"ID":"ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f","Type":"ContainerStarted","Data":"9808b64b6b2a1656e156ac86799618b83bf533c5987ac2a3d462160455c6bf86"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.393427 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.395191 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" event={"ID":"c60d4c47-fcde-4a44-ad68-f6113546b3e5","Type":"ContainerStarted","Data":"224c10ab61ef3522b0f7bac46210d424558250083afb89234ae0a61bbfd27099"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.395257 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.396647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" event={"ID":"190fcfd5-d931-49b7-bca0-b0347fb39619","Type":"ContainerStarted","Data":"c17c09dd386df913c3944cf656be737ca927aa9c8083867eed9bd6f71925585f"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.396697 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.398105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" event={"ID":"e4b457a1-f44d-4117-b4ac-96117293474b","Type":"ContainerStarted","Data":"8b8ae346f01117bb497d382a08e6a8be4f264b0ab78624f4d49efa867790a3f8"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.398190 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.399819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" event={"ID":"d92f3535-049c-451a-8f0f-eb863b9e6319","Type":"ContainerStarted","Data":"a332cefe4caeed3d9cf0efae9d0f5caa00f04d3407dc3b1fb9e390dcc661c990"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.399933 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.401153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" event={"ID":"cbacf405-082e-46b7-94e2-e881df3184bf","Type":"ContainerStarted","Data":"15b828893f175c15b0295331ce657f5198c80f80180dea958f0b53281bb2c5b4"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.401278 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.402472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" event={"ID":"4a5a406f-6b3d-4919-9bde-e7af06fd38d4","Type":"ContainerStarted","Data":"8e5ce7ce68f83e9721195c56b141405c4fa5d0e23c01ece0c7e654bc7412a069"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.402600 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.404255 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" 
event={"ID":"65cdca17-af51-44a1-b1ad-3411ff357a5f","Type":"ContainerStarted","Data":"5983d427134472db2ca1f7b6f921467c22980e0a7c4f644358e14651d8b26afb"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.404356 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.405720 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" event={"ID":"6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4","Type":"ContainerStarted","Data":"e10c553551b59fd488696101a4f55b919f43d187ba24dd03d3438e0d2efe772b"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.406167 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.411708 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" event={"ID":"8d27c6dc-dd9d-4061-aa22-a334d0ffce1e","Type":"ContainerStarted","Data":"c77768ee7b85fe508d20ecca85d9e664585d66ffaf5f35523e5b242b697c64cf"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.411912 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.414655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" event={"ID":"87c73240-34a4-4ad6-b134-1bacc37d2eaa","Type":"ContainerStarted","Data":"32b3acf78d8844c637b1074c2bff6e39a6f25e8b53d0a59f13e72412dc6305bd"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.414692 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" event={"ID":"87c73240-34a4-4ad6-b134-1bacc37d2eaa","Type":"ContainerStarted","Data":"a023a95d519591ec29cf3de10d94619e4a205792369c97ed4cf6509563e2609b"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.414796 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.414838 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" podStartSLOduration=4.404110933 podStartE2EDuration="13.414821455s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.966328189 +0000 UTC m=+950.951148478" lastFinishedPulling="2025-11-24 11:22:33.977038711 +0000 UTC m=+959.961859000" observedRunningTime="2025-11-24 11:22:36.410092628 +0000 UTC m=+962.394912927" watchObservedRunningTime="2025-11-24 11:22:36.414821455 +0000 UTC m=+962.399641744" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.417057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" event={"ID":"97ad1b28-46b3-4eb3-a00d-03b6e5b58575","Type":"ContainerStarted","Data":"02409e90bb13419dee0309c19cba0c873f5c35e9a667f582948d1a0dd6e5e9f6"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.417250 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.418911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" event={"ID":"e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3","Type":"ContainerStarted","Data":"2f81e98a292bf12dae95888ac5d8ccd4c1b7a7d0937c5606bd07c66d3772d386"} Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.419062 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.427181 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" podStartSLOduration=4.151527629 podStartE2EDuration="13.427164792s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.696175557 +0000 UTC m=+950.680995846" lastFinishedPulling="2025-11-24 11:22:33.97181272 +0000 UTC m=+959.956633009" observedRunningTime="2025-11-24 11:22:36.423012012 +0000 UTC m=+962.407832301" watchObservedRunningTime="2025-11-24 11:22:36.427164792 +0000 UTC m=+962.411985081" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.438623 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" podStartSLOduration=4.383971872 podStartE2EDuration="13.438606493s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.94733221 +0000 UTC m=+950.932152499" lastFinishedPulling="2025-11-24 11:22:34.001966831 +0000 UTC m=+959.986787120" observedRunningTime="2025-11-24 11:22:36.436485161 +0000 UTC m=+962.421305450" watchObservedRunningTime="2025-11-24 11:22:36.438606493 +0000 UTC m=+962.423426782" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.457383 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" podStartSLOduration=4.456050035 podStartE2EDuration="13.457370045s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.957562086 +0000 UTC m=+950.942382365" lastFinishedPulling="2025-11-24 11:22:33.958882086 +0000 UTC m=+959.943702375" observedRunningTime="2025-11-24 11:22:36.454688338 +0000 UTC m=+962.439508627" watchObservedRunningTime="2025-11-24 11:22:36.457370045 +0000 UTC m=+962.442190334" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.480645 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" podStartSLOduration=4.473449568 podStartE2EDuration="13.480629078s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.958348408 +0000 UTC m=+950.943168697" lastFinishedPulling="2025-11-24 11:22:33.965527918 +0000 UTC m=+959.950348207" observedRunningTime="2025-11-24 11:22:36.475916801 +0000 UTC m=+962.460737100" watchObservedRunningTime="2025-11-24 11:22:36.480629078 +0000 UTC m=+962.465449367" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.493057 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" podStartSLOduration=4.409268552 podStartE2EDuration="13.493037676s" 
podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.893261936 +0000 UTC m=+950.878082225" lastFinishedPulling="2025-11-24 11:22:33.97703105 +0000 UTC m=+959.961851349" observedRunningTime="2025-11-24 11:22:36.491919064 +0000 UTC m=+962.476739353" watchObservedRunningTime="2025-11-24 11:22:36.493037676 +0000 UTC m=+962.477857985" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.510674 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" podStartSLOduration=5.388685337 podStartE2EDuration="14.510656676s" podCreationTimestamp="2025-11-24 11:22:22 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.879860859 +0000 UTC m=+950.864681148" lastFinishedPulling="2025-11-24 11:22:34.001832188 +0000 UTC m=+959.986652487" observedRunningTime="2025-11-24 11:22:36.503804938 +0000 UTC m=+962.488625247" watchObservedRunningTime="2025-11-24 11:22:36.510656676 +0000 UTC m=+962.495476965" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.520421 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" podStartSLOduration=4.684387048 podStartE2EDuration="13.520399768s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:25.140974819 +0000 UTC m=+951.125795108" lastFinishedPulling="2025-11-24 11:22:33.976987529 +0000 UTC m=+959.961807828" observedRunningTime="2025-11-24 11:22:36.517336229 +0000 UTC m=+962.502156518" watchObservedRunningTime="2025-11-24 11:22:36.520399768 +0000 UTC m=+962.505220057" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.531536 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" podStartSLOduration=4.482773718 podStartE2EDuration="13.531519949s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.953149118 +0000 UTC m=+950.937969407" lastFinishedPulling="2025-11-24 11:22:34.001895349 +0000 UTC m=+959.986715638" observedRunningTime="2025-11-24 11:22:36.529637985 +0000 UTC m=+962.514458274" watchObservedRunningTime="2025-11-24 11:22:36.531519949 +0000 UTC m=+962.516340238" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.554465 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" podStartSLOduration=5.489948965 podStartE2EDuration="14.554431262s" podCreationTimestamp="2025-11-24 11:22:22 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.900722222 +0000 UTC m=+950.885542511" lastFinishedPulling="2025-11-24 11:22:33.965204519 +0000 UTC m=+959.950024808" observedRunningTime="2025-11-24 11:22:36.546619636 +0000 UTC m=+962.531439925" watchObservedRunningTime="2025-11-24 11:22:36.554431262 +0000 UTC m=+962.539251561" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.567440 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" podStartSLOduration=4.835604134 podStartE2EDuration="14.567423378s" podCreationTimestamp="2025-11-24 11:22:22 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.245614098 +0000 UTC m=+950.230434387" lastFinishedPulling="2025-11-24 11:22:33.977433342 +0000 UTC m=+959.962253631" observedRunningTime="2025-11-24 11:22:36.562164445 +0000 UTC 
m=+962.546984734" watchObservedRunningTime="2025-11-24 11:22:36.567423378 +0000 UTC m=+962.552243667" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.582978 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" podStartSLOduration=4.7639717919999995 podStartE2EDuration="14.582964107s" podCreationTimestamp="2025-11-24 11:22:22 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.114924349 +0000 UTC m=+950.099744638" lastFinishedPulling="2025-11-24 11:22:33.933916664 +0000 UTC m=+959.918736953" observedRunningTime="2025-11-24 11:22:36.580967689 +0000 UTC m=+962.565787988" watchObservedRunningTime="2025-11-24 11:22:36.582964107 +0000 UTC m=+962.567784396" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.597711 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" podStartSLOduration=4.308321703 podStartE2EDuration="13.597693783s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.681519223 +0000 UTC m=+950.666339512" lastFinishedPulling="2025-11-24 11:22:33.970891303 +0000 UTC m=+959.955711592" observedRunningTime="2025-11-24 11:22:36.594615154 +0000 UTC m=+962.579435443" watchObservedRunningTime="2025-11-24 11:22:36.597693783 +0000 UTC m=+962.582514072" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.621172 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" podStartSLOduration=4.510500369 podStartE2EDuration="13.621154311s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.864290368 +0000 UTC m=+950.849110657" lastFinishedPulling="2025-11-24 11:22:33.97494432 +0000 UTC m=+959.959764599" observedRunningTime="2025-11-24 11:22:36.614493719 +0000 UTC m=+962.599314008" watchObservedRunningTime="2025-11-24 11:22:36.621154311 +0000 UTC m=+962.605974600" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.638049 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" podStartSLOduration=4.952585266 podStartE2EDuration="14.638032189s" podCreationTimestamp="2025-11-24 11:22:22 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.265524794 +0000 UTC m=+950.250345083" lastFinishedPulling="2025-11-24 11:22:33.950971717 +0000 UTC m=+959.935792006" observedRunningTime="2025-11-24 11:22:36.631945233 +0000 UTC m=+962.616765522" watchObservedRunningTime="2025-11-24 11:22:36.638032189 +0000 UTC m=+962.622852478" Nov 24 11:22:36 crc kubenswrapper[4752]: I1124 11:22:36.669209 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" podStartSLOduration=5.38543772 podStartE2EDuration="13.66919002s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:25.71745629 +0000 UTC m=+951.702276579" lastFinishedPulling="2025-11-24 11:22:34.00120859 +0000 UTC m=+959.986028879" observedRunningTime="2025-11-24 11:22:36.666586965 +0000 UTC m=+962.651407254" watchObservedRunningTime="2025-11-24 11:22:36.66919002 +0000 UTC m=+962.654010309" Nov 24 11:22:37 crc kubenswrapper[4752]: I1124 11:22:37.427787 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:40 crc kubenswrapper[4752]: I1124 11:22:40.457220 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" event={"ID":"341d84f5-7ebf-48bb-a7d1-6c55d45d0c58","Type":"ContainerStarted","Data":"d7c0b84d691a3902f8a253c30b157a792e085dd7242f6dd5e2d0ab7eb628edea"} Nov 24 11:22:40 crc kubenswrapper[4752]: I1124 11:22:40.458526 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:40 crc kubenswrapper[4752]: I1124 11:22:40.481658 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" podStartSLOduration=2.51379193 podStartE2EDuration="17.481641525s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.966618727 +0000 UTC m=+950.951439016" lastFinishedPulling="2025-11-24 11:22:39.934468312 +0000 UTC m=+965.919288611" observedRunningTime="2025-11-24 11:22:40.479295387 +0000 UTC m=+966.464115676" watchObservedRunningTime="2025-11-24 11:22:40.481641525 +0000 UTC m=+966.466461814" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.290686 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-zjh9k" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.311309 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-chvx8" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.387438 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-pcppk" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.424422 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-r8x8j" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.445586 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7969689c84-56mql" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.471697 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-d8l5p" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.483208 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" event={"ID":"eb9eb6b3-a33b-4f64-92a4-f2415648f6c6","Type":"ContainerStarted","Data":"630d7cf375b7edacff5313227ebdce8ef4e6f3efff27e680d1f529e0459459f3"} Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.483540 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.488700 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-hrlfx" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.513467 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-248dp" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.528723 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" podStartSLOduration=3.240738801 podStartE2EDuration="20.528707896s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:25.142276947 +0000 UTC m=+951.127097236" lastFinishedPulling="2025-11-24 11:22:42.430246042 +0000 UTC m=+968.415066331" observedRunningTime="2025-11-24 11:22:43.525611507 +0000 UTC m=+969.510431796" watchObservedRunningTime="2025-11-24 11:22:43.528707896 +0000 UTC m=+969.513528185" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.531338 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-wh2h5" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.598849 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58f887965d-bfqrz" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.618933 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-vkkhd" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.650307 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-prxnc" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.775085 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-kvzg4" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.804375 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d656998f4-pqj8q" Nov 24 11:22:43 crc kubenswrapper[4752]: I1124 11:22:43.871524 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-sp84f" Nov 24 11:22:45 crc kubenswrapper[4752]: I1124 11:22:45.159214 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-csjb7" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.521628 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" event={"ID":"7bb11c48-7c81-4e71-a258-1bd291051c79","Type":"ContainerStarted","Data":"0d6a6f79a128eeca58f55d697ee9d317955fa7130f79cfde53d8c55341b088fc"} Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.523150 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.525205 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" event={"ID":"c49282e6-3072-458d-8fdb-1e8282b3aa59","Type":"ContainerStarted","Data":"4723621cdce9d771f8f725a23cb533dbca8a240fca6bf291854f41959d9021dd"} Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.525495 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.527712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" event={"ID":"05751949-a258-4470-b4ad-4ad1ae9a3bc6","Type":"ContainerStarted","Data":"fd20b7ca716ee5899f1e145a29a83ecd426c33063c6ea7d2b09df4433fb1dc84"} Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.528118 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.548400 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" podStartSLOduration=3.128725091 podStartE2EDuration="25.548375299s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.966539655 +0000 UTC m=+950.951359944" lastFinishedPulling="2025-11-24 11:22:47.386189863 +0000 UTC m=+973.371010152" observedRunningTime="2025-11-24 11:22:48.542944692 +0000 UTC m=+974.527765021" watchObservedRunningTime="2025-11-24 11:22:48.548375299 +0000 UTC m=+974.533195618" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.571161 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" podStartSLOduration=3.150543592 podStartE2EDuration="25.571126197s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:24.967879494 +0000 UTC m=+950.952699783" lastFinishedPulling="2025-11-24 11:22:47.388462099 +0000 UTC m=+973.373282388" observedRunningTime="2025-11-24 11:22:48.566298577 +0000 UTC m=+974.551118876" watchObservedRunningTime="2025-11-24 11:22:48.571126197 +0000 UTC m=+974.555946526" Nov 24 11:22:48 crc kubenswrapper[4752]: I1124 11:22:48.593085 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" podStartSLOduration=3.346881029 podStartE2EDuration="25.593063711s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:25.141499764 +0000 UTC m=+951.126320053" lastFinishedPulling="2025-11-24 11:22:47.387682446 +0000 UTC m=+973.372502735" observedRunningTime="2025-11-24 11:22:48.588804778 +0000 UTC m=+974.573625067" watchObservedRunningTime="2025-11-24 11:22:48.593063711 +0000 UTC m=+974.577884010" Nov 24 11:22:53 crc kubenswrapper[4752]: I1124 11:22:53.762885 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-d4fdd" Nov 24 11:22:53 crc kubenswrapper[4752]: I1124 11:22:53.830321 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-p8b8t" Nov 24 11:22:53 crc kubenswrapper[4752]: I1124 11:22:53.839397 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-vdzg7" Nov 24 11:22:54 crc kubenswrapper[4752]: I1124 11:22:54.153989 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-b4c496f69-bbxj8" Nov 24 11:22:54 crc kubenswrapper[4752]: I1124 11:22:54.171041 4752 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-llhmp" Nov 24 11:22:54 crc kubenswrapper[4752]: I1124 11:22:54.594146 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" event={"ID":"0198b944-3886-44c4-85c1-8786136c4f2a","Type":"ContainerStarted","Data":"affc22768742dbfb02d9f7c7c478edafb48f7f81cb95c99827ffd48775115b0f"} Nov 24 11:22:55 crc kubenswrapper[4752]: I1124 11:22:55.625403 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg" podStartSLOduration=10.219007699 podStartE2EDuration="32.625380443s" podCreationTimestamp="2025-11-24 11:22:23 +0000 UTC" firstStartedPulling="2025-11-24 11:22:25.171340487 +0000 UTC m=+951.156160776" lastFinishedPulling="2025-11-24 11:22:47.577713231 +0000 UTC m=+973.562533520" observedRunningTime="2025-11-24 11:22:55.619489743 +0000 UTC m=+981.604310042" watchObservedRunningTime="2025-11-24 11:22:55.625380443 +0000 UTC m=+981.610200752" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.290010 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.291420 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.297182 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.297336 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-n4wqj" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.297441 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.299488 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.310219 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.452994 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.453043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pttv\" (UniqueName: \"kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.478843 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.480296 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.482238 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.492281 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.554255 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.554312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pttv\" (UniqueName: \"kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.555136 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.575674 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pttv\" (UniqueName: \"kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv\") pod \"dnsmasq-dns-675f4bcbfc-xgwmt\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.629160 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.658633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.658708 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.658959 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnbb\" (UniqueName: \"kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.760051 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnbb\" (UniqueName: \"kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.760085 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.760123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.761372 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.763062 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 crc kubenswrapper[4752]: I1124 11:23:11.788987 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnbb\" (UniqueName: \"kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb\") pod \"dnsmasq-dns-78dd6ddcc-snmhr\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:11 
crc kubenswrapper[4752]: I1124 11:23:11.793544 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:12 crc kubenswrapper[4752]: I1124 11:23:12.033304 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:12 crc kubenswrapper[4752]: I1124 11:23:12.038956 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 11:23:12 crc kubenswrapper[4752]: I1124 11:23:12.189866 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:12 crc kubenswrapper[4752]: W1124 11:23:12.198867 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b65c931_7677_4844_a2ed_568c49b76406.slice/crio-25e98a4c1cfef7e1e3b167c4070dacd0741b71a834fdb68b7b8dcf7f1aaf5838 WatchSource:0}: Error finding container 25e98a4c1cfef7e1e3b167c4070dacd0741b71a834fdb68b7b8dcf7f1aaf5838: Status 404 returned error can't find the container with id 25e98a4c1cfef7e1e3b167c4070dacd0741b71a834fdb68b7b8dcf7f1aaf5838 Nov 24 11:23:12 crc kubenswrapper[4752]: I1124 11:23:12.717955 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" event={"ID":"5b65c931-7677-4844-a2ed-568c49b76406","Type":"ContainerStarted","Data":"25e98a4c1cfef7e1e3b167c4070dacd0741b71a834fdb68b7b8dcf7f1aaf5838"} Nov 24 11:23:12 crc kubenswrapper[4752]: I1124 11:23:12.720359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" event={"ID":"571aeb3d-6dd9-4362-8680-f625baa05500","Type":"ContainerStarted","Data":"4ce0620b9a536ce9de1ea5ee722d85df55c3bec5e0b1e4d0ab18a37cc3bd844d"} Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.868735 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.886397 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.887533 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.896773 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.896809 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.896863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxxz\" (UniqueName: \"kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:13 crc kubenswrapper[4752]: I1124 11:23:13.898085 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:13.999682 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:13.999724 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:13.999788 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxxz\" (UniqueName: \"kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.000803 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.001433 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.038811 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxxz\" (UniqueName: 
\"kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz\") pod \"dnsmasq-dns-666b6646f7-xrzhq\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.156577 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.184940 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.186959 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.199370 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.215074 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.303230 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.303292 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.303767 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx8vw\" (UniqueName: \"kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.407078 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.407361 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.407403 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx8vw\" (UniqueName: \"kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.408378 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.409189 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.429849 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx8vw\" (UniqueName: \"kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw\") pod \"dnsmasq-dns-57d769cc4f-mtl48\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.519492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:14 crc kubenswrapper[4752]: I1124 11:23:14.860460 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:14 crc kubenswrapper[4752]: W1124 11:23:14.899193 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf3eb37_e41b_4a4f_b394_83cc91c4d837.slice/crio-d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6 WatchSource:0}: Error finding container d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6: Status 404 returned error can't find the container with id d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6 Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.034963 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.037012 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.039792 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.039792 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.039888 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.040203 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.042194 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.042552 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-br8jb" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.042695 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.049470 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.130933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.130966 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smttb\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131129 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131157 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131280 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131362 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131401 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.131444 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.144501 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.232814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.232910 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.232956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.232990 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233014 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233046 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233067 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233092 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233158 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233181 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233211 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smttb\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.233704 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.234545 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 
11:23:15.234891 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.234950 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.235442 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.235986 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.237859 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.238953 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.240307 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.250342 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smttb\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.252563 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.255097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 
crc kubenswrapper[4752]: I1124 11:23:15.301382 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.302492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.308176 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.308260 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gzgjs" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.308407 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.308897 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.309838 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.310007 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.310795 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.315924 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.373984 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.435860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.435917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.436059 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.436154 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437102 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437143 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437191 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437396 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4pjw\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437523 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.437777 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539057 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539146 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539166 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539218 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4pjw\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539246 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 
11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539268 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539320 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539345 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.539663 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.540121 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.540216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.540252 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.540568 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.541166 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.543725 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.545035 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.548687 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.555282 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4pjw\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.559770 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.577192 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.658424 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:23:15 crc kubenswrapper[4752]: I1124 11:23:15.840336 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" event={"ID":"ddf3eb37-e41b-4a4f-b394-83cc91c4d837","Type":"ContainerStarted","Data":"d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6"} Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.723082 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.724958 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.730611 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.730782 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.731187 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.733459 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zg767" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.735795 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.760977 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.858777 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.858848 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.858898 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.858943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.858972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.859003 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.859028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.859091 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsng\" (UniqueName: \"kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960807 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960877 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.960951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.961019 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsng\" (UniqueName: \"kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng\") pod \"openstack-galera-0\" (UID: 
\"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.961826 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.963936 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.965033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.966857 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.967105 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.967946 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.974600 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:16 crc kubenswrapper[4752]: I1124 11:23:16.986116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsng\" (UniqueName: \"kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:17 crc kubenswrapper[4752]: I1124 11:23:17.005454 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " pod="openstack/openstack-galera-0" Nov 24 11:23:17 crc kubenswrapper[4752]: I1124 11:23:17.051614 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 11:23:17 crc kubenswrapper[4752]: I1124 11:23:17.863424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" event={"ID":"152a7f03-9669-4f3f-977e-0da6d03122f5","Type":"ContainerStarted","Data":"859b0faa5a37a68ef428441a2e3d48617818fa4577c9fbd32676a827dc16a410"} Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.121634 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.122953 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.125598 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.125812 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.125880 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.126539 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c4wzs" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.129769 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.279855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.279907 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.279940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmkh7\" (UniqueName: \"kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.279975 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.280017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " 
pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.280152 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.280211 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.280348 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.348644 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.349571 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.352328 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.352525 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.355219 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9pzt6" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383007 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383058 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383083 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmkh7\" (UniqueName: \"kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383141 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383180 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383229 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383254 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.383302 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.384085 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.384590 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.384626 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.384766 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 
11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.389761 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.398616 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.415401 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmkh7\" (UniqueName: \"kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.419783 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.443989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.485446 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.485808 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.485828 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd9l\" (UniqueName: \"kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.485890 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.485924 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: 
I1124 11:23:18.587117 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.587177 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.587195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd9l\" (UniqueName: \"kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.587257 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.587288 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.587894 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.588324 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.592368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.603274 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.618287 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd9l\" (UniqueName: \"kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l\") pod \"memcached-0\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 
11:23:18.689880 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 11:23:18 crc kubenswrapper[4752]: I1124 11:23:18.752438 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.297300 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.298827 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.301417 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-p72pf" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.315856 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.413572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lgt\" (UniqueName: \"kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt\") pod \"kube-state-metrics-0\" (UID: \"4777fb8a-2a08-4286-8c15-39d249db6db0\") " pod="openstack/kube-state-metrics-0" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.515279 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lgt\" (UniqueName: \"kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt\") pod \"kube-state-metrics-0\" (UID: \"4777fb8a-2a08-4286-8c15-39d249db6db0\") " pod="openstack/kube-state-metrics-0" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.535510 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lgt\" (UniqueName: \"kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt\") pod \"kube-state-metrics-0\" (UID: \"4777fb8a-2a08-4286-8c15-39d249db6db0\") " pod="openstack/kube-state-metrics-0" Nov 24 11:23:20 crc kubenswrapper[4752]: I1124 11:23:20.614411 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.526157 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.527661 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.534001 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7tlng" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.534201 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.534693 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.551945 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"] Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.555668 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.575382 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.594728 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"] Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664402 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664467 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664510 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664535 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664559 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664582 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6whzx\" (UniqueName: \"kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664660 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j8xn\" (UniqueName: \"kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 
11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664699 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.664960 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.665011 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.766979 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767135 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767173 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767243 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767267 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6whzx\" (UniqueName: \"kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767383 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j8xn\" (UniqueName: \"kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767408 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " 
pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.767863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.768593 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.768825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.769018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.769171 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.769201 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.769880 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.770711 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.786350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.786919 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle\") pod 
\"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.789842 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6whzx\" (UniqueName: \"kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx\") pod \"ovn-controller-ovs-v6jhw\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") " pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.793364 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j8xn\" (UniqueName: \"kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn\") pod \"ovn-controller-xmc2s\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.865395 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:23 crc kubenswrapper[4752]: I1124 11:23:23.888241 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.890215 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.892027 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.895879 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.895901 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.896073 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.896121 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dbllw" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.900706 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 24 11:23:24 crc kubenswrapper[4752]: I1124 11:23:24.914652 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005485 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005599 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvs2l\" (UniqueName: \"kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005653 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005699 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.005729 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106499 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106522 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvs2l\" (UniqueName: \"kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " 
pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106579 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106608 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.106642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.107002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.107162 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.107362 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.108538 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.112080 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.113246 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.125086 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.125682 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.127940 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvs2l\" (UniqueName: \"kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l\") pod \"ovsdbserver-nb-0\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:25 crc kubenswrapper[4752]: I1124 11:23:25.218176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:26 crc kubenswrapper[4752]: I1124 11:23:26.953944 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.534721 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.534860 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5pttv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-xgwmt_openstack(571aeb3d-6dd9-4362-8680-f625baa05500): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.536845 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" podUID="571aeb3d-6dd9-4362-8680-f625baa05500" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.596885 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.598610 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.613437 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.613803 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.613919 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.614457 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-cmkll" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.649483 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660765 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcb9z\" (UniqueName: \"kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660817 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660906 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.660968 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.661027 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.661075 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.675651 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.675835 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvnbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-snmhr_openstack(5b65c931-7677-4844-a2ed-568c49b76406): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 11:23:27 crc kubenswrapper[4752]: E1124 11:23:27.682463 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" podUID="5b65c931-7677-4844-a2ed-568c49b76406" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.763614 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.763974 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764011 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764121 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764187 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcb9z\" (UniqueName: \"kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.764208 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.765602 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.765932 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") device mount path \"/mnt/openstack/pv01\"" 
pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.768840 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.770299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.778269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.807038 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcb9z\" (UniqueName: \"kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.807929 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.808705 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.848822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") " pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:27 crc kubenswrapper[4752]: I1124 11:23:27.968493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerStarted","Data":"70fcd895ea7f430512fbb3aced2ca12c3fba7d5a82925481ac5e2520dbe70a4c"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.008093 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.480811 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.489017 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.502127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: W1124 11:23:28.514810 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3426b66_ea91_4d99_86c5_955d77073619.slice/crio-10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827 WatchSource:0}: Error finding container 10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827: Status 404 returned error can't find the container with id 10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827 Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.582517 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config\") pod \"571aeb3d-6dd9-4362-8680-f625baa05500\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.582581 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pttv\" (UniqueName: \"kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv\") pod \"571aeb3d-6dd9-4362-8680-f625baa05500\" (UID: \"571aeb3d-6dd9-4362-8680-f625baa05500\") " Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.583374 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config" (OuterVolumeSpecName: "config") pod "571aeb3d-6dd9-4362-8680-f625baa05500" (UID: "571aeb3d-6dd9-4362-8680-f625baa05500"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.629136 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv" (OuterVolumeSpecName: "kube-api-access-5pttv") pod "571aeb3d-6dd9-4362-8680-f625baa05500" (UID: "571aeb3d-6dd9-4362-8680-f625baa05500"). InnerVolumeSpecName "kube-api-access-5pttv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.667546 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.695547 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvnbb\" (UniqueName: \"kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb\") pod \"5b65c931-7677-4844-a2ed-568c49b76406\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.695615 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc\") pod \"5b65c931-7677-4844-a2ed-568c49b76406\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.695663 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config\") pod \"5b65c931-7677-4844-a2ed-568c49b76406\" (UID: \"5b65c931-7677-4844-a2ed-568c49b76406\") " Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.698010 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b65c931-7677-4844-a2ed-568c49b76406" (UID: "5b65c931-7677-4844-a2ed-568c49b76406"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.701143 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pttv\" (UniqueName: \"kubernetes.io/projected/571aeb3d-6dd9-4362-8680-f625baa05500-kube-api-access-5pttv\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.701252 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.701292 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/571aeb3d-6dd9-4362-8680-f625baa05500-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.701994 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config" (OuterVolumeSpecName: "config") pod "5b65c931-7677-4844-a2ed-568c49b76406" (UID: "5b65c931-7677-4844-a2ed-568c49b76406"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.703044 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.706320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb" (OuterVolumeSpecName: "kube-api-access-jvnbb") pod "5b65c931-7677-4844-a2ed-568c49b76406" (UID: "5b65c931-7677-4844-a2ed-568c49b76406"). InnerVolumeSpecName "kube-api-access-jvnbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.755810 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.756219 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.773451 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.805102 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b65c931-7677-4844-a2ed-568c49b76406-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.805135 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvnbb\" (UniqueName: \"kubernetes.io/projected/5b65c931-7677-4844-a2ed-568c49b76406-kube-api-access-jvnbb\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.809777 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:23:28 crc kubenswrapper[4752]: W1124 11:23:28.814632 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d558bea_9793_4831_9304_f8cee2b2331e.slice/crio-c55fa00ce217c9ff143279ea2ebf102cf6985dd0a7e29e449f51597e7dc588e9 WatchSource:0}: Error finding container c55fa00ce217c9ff143279ea2ebf102cf6985dd0a7e29e449f51597e7dc588e9: Status 404 returned error can't find the container with id c55fa00ce217c9ff143279ea2ebf102cf6985dd0a7e29e449f51597e7dc588e9 Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.978544 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerStarted","Data":"dfc4f105e340723be162ecae7dacaabe01750f1b6aaf460058b78ab7887bc5f1"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.980387 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s" event={"ID":"9495af0b-dca8-4695-8f40-f4f9a7ec5229","Type":"ContainerStarted","Data":"97ae2b312225cdca306e777fe223cca18db358f282d54afa66de7b1071301f4f"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.981672 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" event={"ID":"571aeb3d-6dd9-4362-8680-f625baa05500","Type":"ContainerDied","Data":"4ce0620b9a536ce9de1ea5ee722d85df55c3bec5e0b1e4d0ab18a37cc3bd844d"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.981696 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-xgwmt" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.986627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1020727e-3725-4fed-b276-043ae4d30c4c","Type":"ContainerStarted","Data":"dcba6482a3ae81032c0db9645c5e86cb29e0abe3e9376cbfd1c49910aa608d17"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.988462 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerStarted","Data":"c55fa00ce217c9ff143279ea2ebf102cf6985dd0a7e29e449f51597e7dc588e9"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.990278 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4777fb8a-2a08-4286-8c15-39d249db6db0","Type":"ContainerStarted","Data":"a5a1ce0dd28425425124e8919363775aaf4eda0b4450b67adff2e30e758a681e"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.992673 4752 generic.go:334] "Generic (PLEG): container finished" podID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerID="1c1e759158a12e5160fb333c16f566796a888f4675fa0f3eb44a700dd00afe8e" exitCode=0 Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.992732 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" event={"ID":"152a7f03-9669-4f3f-977e-0da6d03122f5","Type":"ContainerDied","Data":"1c1e759158a12e5160fb333c16f566796a888f4675fa0f3eb44a700dd00afe8e"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.995184 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" event={"ID":"5b65c931-7677-4844-a2ed-568c49b76406","Type":"ContainerDied","Data":"25e98a4c1cfef7e1e3b167c4070dacd0741b71a834fdb68b7b8dcf7f1aaf5838"} Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.995317 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-snmhr" Nov 24 11:23:28 crc kubenswrapper[4752]: I1124 11:23:28.997138 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerStarted","Data":"5c2f60fd67a0566ec2eb65f095e07acc01aceeeeb15ee5ace5d35d83a86ef470"} Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.001675 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerStarted","Data":"10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827"} Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.006704 4752 generic.go:334] "Generic (PLEG): container finished" podID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerID="b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da" exitCode=0 Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.006769 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" event={"ID":"ddf3eb37-e41b-4a4f-b394-83cc91c4d837","Type":"ContainerDied","Data":"b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da"} Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.035335 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.047208 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-xgwmt"] Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.057406 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.063286 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-snmhr"] Nov 24 11:23:29 crc kubenswrapper[4752]: E1124 11:23:29.234344 4752 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Nov 24 11:23:29 crc kubenswrapper[4752]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ddf3eb37-e41b-4a4f-b394-83cc91c4d837/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 24 11:23:29 crc kubenswrapper[4752]: > podSandboxID="d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6" Nov 24 11:23:29 crc kubenswrapper[4752]: E1124 11:23:29.234574 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Nov 24 11:23:29 crc kubenswrapper[4752]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztxxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-xrzhq_openstack(ddf3eb37-e41b-4a4f-b394-83cc91c4d837): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ddf3eb37-e41b-4a4f-b394-83cc91c4d837/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 24 11:23:29 crc kubenswrapper[4752]: > logger="UnhandledError" Nov 24 11:23:29 crc kubenswrapper[4752]: E1124 11:23:29.235833 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ddf3eb37-e41b-4a4f-b394-83cc91c4d837/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" Nov 24 11:23:29 crc kubenswrapper[4752]: I1124 11:23:29.369256 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"] Nov 24 11:23:29 crc kubenswrapper[4752]: W1124 11:23:29.376117 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26da9a73_3ede_476b_bfab_08839a8b88d1.slice/crio-56fde739099e3c78f5d578401686680df965464a0a7168b59933697eff75109c WatchSource:0}: Error finding 
container 56fde739099e3c78f5d578401686680df965464a0a7168b59933697eff75109c: Status 404 returned error can't find the container with id 56fde739099e3c78f5d578401686680df965464a0a7168b59933697eff75109c Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.020140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" event={"ID":"152a7f03-9669-4f3f-977e-0da6d03122f5","Type":"ContainerStarted","Data":"162f2f38a32ff4f50b02e21cbe01ccd0e9abe6155ab668e7f74c46cf3a0bd5d3"} Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.020215 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.022704 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerStarted","Data":"56fde739099e3c78f5d578401686680df965464a0a7168b59933697eff75109c"} Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.041483 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" podStartSLOduration=5.961846669 podStartE2EDuration="16.041461331s" podCreationTimestamp="2025-11-24 11:23:14 +0000 UTC" firstStartedPulling="2025-11-24 11:23:17.743316725 +0000 UTC m=+1003.728137044" lastFinishedPulling="2025-11-24 11:23:27.822931417 +0000 UTC m=+1013.807751706" observedRunningTime="2025-11-24 11:23:30.035237171 +0000 UTC m=+1016.020057480" watchObservedRunningTime="2025-11-24 11:23:30.041461331 +0000 UTC m=+1016.026281620" Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.471305 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.740926 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571aeb3d-6dd9-4362-8680-f625baa05500" path="/var/lib/kubelet/pods/571aeb3d-6dd9-4362-8680-f625baa05500/volumes" Nov 24 11:23:30 crc kubenswrapper[4752]: I1124 11:23:30.741370 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b65c931-7677-4844-a2ed-568c49b76406" path="/var/lib/kubelet/pods/5b65c931-7677-4844-a2ed-568c49b76406/volumes" Nov 24 11:23:31 crc kubenswrapper[4752]: W1124 11:23:31.352532 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a1ee4e_2810_485d_b931_d3121fe7e264.slice/crio-825da2ef53156d201cfd38c810103d4083ba547ce727cf4b78de89b162548a38 WatchSource:0}: Error finding container 825da2ef53156d201cfd38c810103d4083ba547ce727cf4b78de89b162548a38: Status 404 returned error can't find the container with id 825da2ef53156d201cfd38c810103d4083ba547ce727cf4b78de89b162548a38 Nov 24 11:23:32 crc kubenswrapper[4752]: I1124 11:23:32.049290 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerStarted","Data":"825da2ef53156d201cfd38c810103d4083ba547ce727cf4b78de89b162548a38"} Nov 24 11:23:34 crc kubenswrapper[4752]: I1124 11:23:34.528994 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:34 crc kubenswrapper[4752]: I1124 11:23:34.582003 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.658358 4752 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.673229 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.673340 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.675527 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.754629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.754999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.755065 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.755156 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.755244 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcwj\" (UniqueName: \"kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.755364 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.818576 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-82l5g"] Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.819952 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.830238 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.831998 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-82l5g"] Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863411 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863464 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863492 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863651 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.863717 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcwj\" (UniqueName: \"kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.864277 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.864489 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.864725 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.871331 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.876448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.888601 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bcwj\" (UniqueName: \"kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj\") pod \"ovn-controller-metrics-cm8dh\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") " pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.965594 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.965679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.965813 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.965864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4h4j\" (UniqueName: \"kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:35 crc kubenswrapper[4752]: I1124 11:23:35.985196 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-82l5g"] Nov 24 11:23:35 crc kubenswrapper[4752]: E1124 11:23:35.985733 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-x4h4j ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" podUID="4d8f3080-018d-4e4e-b38b-5a3a2190312e" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.013347 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"] Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.014666 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.017970 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.023865 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"] Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.034570 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.067806 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.067859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.067915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4h4j\" (UniqueName: \"kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.068039 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.068777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.068870 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.068879 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.077363 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.084261 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4h4j\" (UniqueName: \"kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j\") pod \"dnsmasq-dns-5bf47b49b7-82l5g\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.084475 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169074 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc\") pod \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169218 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4h4j\" (UniqueName: \"kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j\") pod \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169299 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config\") pod \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169351 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb\") pod \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\" (UID: \"4d8f3080-018d-4e4e-b38b-5a3a2190312e\") " Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169546 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmfw\" (UniqueName: \"kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169607 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169644 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169689 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169767 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config" (OuterVolumeSpecName: "config") pod "4d8f3080-018d-4e4e-b38b-5a3a2190312e" (UID: "4d8f3080-018d-4e4e-b38b-5a3a2190312e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169833 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.169905 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d8f3080-018d-4e4e-b38b-5a3a2190312e" (UID: "4d8f3080-018d-4e4e-b38b-5a3a2190312e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.170027 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.170175 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d8f3080-018d-4e4e-b38b-5a3a2190312e" (UID: "4d8f3080-018d-4e4e-b38b-5a3a2190312e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.183023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j" (OuterVolumeSpecName: "kube-api-access-x4h4j") pod "4d8f3080-018d-4e4e-b38b-5a3a2190312e" (UID: "4d8f3080-018d-4e4e-b38b-5a3a2190312e"). InnerVolumeSpecName "kube-api-access-x4h4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271402 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271441 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmfw\" (UniqueName: \"kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271493 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271528 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271566 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271585 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4h4j\" (UniqueName: \"kubernetes.io/projected/4d8f3080-018d-4e4e-b38b-5a3a2190312e-kube-api-access-x4h4j\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.271605 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d8f3080-018d-4e4e-b38b-5a3a2190312e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.272335 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.272616 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 
24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.278502 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.278520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.289154 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmfw\" (UniqueName: \"kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw\") pod \"dnsmasq-dns-8554648995-tdgsl\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:36 crc kubenswrapper[4752]: I1124 11:23:36.335162 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:37 crc kubenswrapper[4752]: I1124 11:23:37.085989 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-82l5g" Nov 24 11:23:37 crc kubenswrapper[4752]: I1124 11:23:37.157802 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-82l5g"] Nov 24 11:23:37 crc kubenswrapper[4752]: I1124 11:23:37.166872 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-82l5g"] Nov 24 11:23:37 crc kubenswrapper[4752]: I1124 11:23:37.513695 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:23:37 crc kubenswrapper[4752]: I1124 11:23:37.526691 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"] Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.095862 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerStarted","Data":"d34722d3f40c1996b7a80559039e9a50e8496db2ea2f4dc7d64da0f375ada589"} Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.109861 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" event={"ID":"ddf3eb37-e41b-4a4f-b394-83cc91c4d837","Type":"ContainerStarted","Data":"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a"} Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.109973 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.109966 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="dnsmasq-dns" containerID="cri-o://d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a" gracePeriod=10 Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.111561 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cm8dh" 
event={"ID":"01bd298d-28f4-4c50-80b3-f81ed96de794","Type":"ContainerStarted","Data":"7ffa3c32b294f924b2c9e536b2d0f491231355abdf781c8100897626e5c10bdb"} Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.113129 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-tdgsl" event={"ID":"ec9e7ec8-7392-4362-8576-d62ca0508254","Type":"ContainerStarted","Data":"e84117e98e19ede3ba855ad928c963edbc9c978ccf97222ff938c19073b9acdb"} Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.116167 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1020727e-3725-4fed-b276-043ae4d30c4c","Type":"ContainerStarted","Data":"2163dde349442cb3ffdc514182a2490f24052d186366d6badae0a20befe860b1"} Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.118720 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.142087 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" podStartSLOduration=12.230568566 podStartE2EDuration="25.142067256s" podCreationTimestamp="2025-11-24 11:23:13 +0000 UTC" firstStartedPulling="2025-11-24 11:23:14.904049173 +0000 UTC m=+1000.888869462" lastFinishedPulling="2025-11-24 11:23:27.815547863 +0000 UTC m=+1013.800368152" observedRunningTime="2025-11-24 11:23:38.137144284 +0000 UTC m=+1024.121964583" watchObservedRunningTime="2025-11-24 11:23:38.142067256 +0000 UTC m=+1024.126887545" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.163143 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.090636623 podStartE2EDuration="20.163117035s" podCreationTimestamp="2025-11-24 11:23:18 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.720224374 +0000 UTC m=+1014.705044663" lastFinishedPulling="2025-11-24 11:23:36.792704786 +0000 UTC m=+1022.777525075" observedRunningTime="2025-11-24 11:23:38.159792869 +0000 UTC m=+1024.144613178" watchObservedRunningTime="2025-11-24 11:23:38.163117035 +0000 UTC m=+1024.147937324" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.516623 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.643114 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config\") pod \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.643223 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztxxz\" (UniqueName: \"kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz\") pod \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.643268 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc\") pod \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\" (UID: \"ddf3eb37-e41b-4a4f-b394-83cc91c4d837\") " Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.652346 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz" (OuterVolumeSpecName: "kube-api-access-ztxxz") pod "ddf3eb37-e41b-4a4f-b394-83cc91c4d837" (UID: "ddf3eb37-e41b-4a4f-b394-83cc91c4d837"). InnerVolumeSpecName "kube-api-access-ztxxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.701041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddf3eb37-e41b-4a4f-b394-83cc91c4d837" (UID: "ddf3eb37-e41b-4a4f-b394-83cc91c4d837"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.702924 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config" (OuterVolumeSpecName: "config") pod "ddf3eb37-e41b-4a4f-b394-83cc91c4d837" (UID: "ddf3eb37-e41b-4a4f-b394-83cc91c4d837"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.738981 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8f3080-018d-4e4e-b38b-5a3a2190312e" path="/var/lib/kubelet/pods/4d8f3080-018d-4e4e-b38b-5a3a2190312e/volumes" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.745975 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztxxz\" (UniqueName: \"kubernetes.io/projected/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-kube-api-access-ztxxz\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.746005 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:38 crc kubenswrapper[4752]: I1124 11:23:38.746017 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddf3eb37-e41b-4a4f-b394-83cc91c4d837-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:38 crc kubenswrapper[4752]: E1124 11:23:38.885730 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26da9a73_3ede_476b_bfab_08839a8b88d1.slice/crio-95748596e6497c0d2fb881e9a27eb4fca9a23e8c544c31ba9b578f2427735437.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf3eb37_e41b_4a4f_b394_83cc91c4d837.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf3eb37_e41b_4a4f_b394_83cc91c4d837.slice/crio-d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26da9a73_3ede_476b_bfab_08839a8b88d1.slice/crio-conmon-95748596e6497c0d2fb881e9a27eb4fca9a23e8c544c31ba9b578f2427735437.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.131456 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4777fb8a-2a08-4286-8c15-39d249db6db0","Type":"ContainerStarted","Data":"048913e60e85a005c432d1b709dfd27f75ae10ec635f5a2ae5749e8ba5d88dcb"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.131545 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.136390 4752 generic.go:334] "Generic (PLEG): container finished" podID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerID="d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a" exitCode=0 Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.136444 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" event={"ID":"ddf3eb37-e41b-4a4f-b394-83cc91c4d837","Type":"ContainerDied","Data":"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.136468 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" event={"ID":"ddf3eb37-e41b-4a4f-b394-83cc91c4d837","Type":"ContainerDied","Data":"d0a6a1b167389b0fc3545e087d7f4d38b8d18b7c0abc0dd4a5fe790ceaa426f6"} Nov 24 11:23:39 crc kubenswrapper[4752]: 
I1124 11:23:39.136491 4752 scope.go:117] "RemoveContainer" containerID="d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.136616 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xrzhq" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.142276 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s" event={"ID":"9495af0b-dca8-4695-8f40-f4f9a7ec5229","Type":"ContainerStarted","Data":"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.142399 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xmc2s" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.147678 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerStarted","Data":"342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.153947 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.882544245 podStartE2EDuration="19.153905345s" podCreationTimestamp="2025-11-24 11:23:20 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.714250012 +0000 UTC m=+1014.699070301" lastFinishedPulling="2025-11-24 11:23:37.985611112 +0000 UTC m=+1023.970431401" observedRunningTime="2025-11-24 11:23:39.145921714 +0000 UTC m=+1025.130742013" watchObservedRunningTime="2025-11-24 11:23:39.153905345 +0000 UTC m=+1025.138725634" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.154415 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerID="b631f35d465abf3ee59e224aeb1ffa1314cc93be6fa424eed3d23de583d0fa6f" exitCode=0 Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.154477 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-tdgsl" event={"ID":"ec9e7ec8-7392-4362-8576-d62ca0508254","Type":"ContainerDied","Data":"b631f35d465abf3ee59e224aeb1ffa1314cc93be6fa424eed3d23de583d0fa6f"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.160167 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerStarted","Data":"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.169130 4752 generic.go:334] "Generic (PLEG): container finished" podID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerID="95748596e6497c0d2fb881e9a27eb4fca9a23e8c544c31ba9b578f2427735437" exitCode=0 Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.169212 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerDied","Data":"95748596e6497c0d2fb881e9a27eb4fca9a23e8c544c31ba9b578f2427735437"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.176957 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerStarted","Data":"2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003"} Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.185617 4752 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xmc2s" podStartSLOduration=7.885418654 podStartE2EDuration="16.18556967s" podCreationTimestamp="2025-11-24 11:23:23 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.722787608 +0000 UTC m=+1014.707607907" lastFinishedPulling="2025-11-24 11:23:37.022938634 +0000 UTC m=+1023.007758923" observedRunningTime="2025-11-24 11:23:39.164056928 +0000 UTC m=+1025.148877237" watchObservedRunningTime="2025-11-24 11:23:39.18556967 +0000 UTC m=+1025.170389959" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.192903 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.203936 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xrzhq"] Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.208969 4752 scope.go:117] "RemoveContainer" containerID="b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.297456 4752 scope.go:117] "RemoveContainer" containerID="d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a" Nov 24 11:23:39 crc kubenswrapper[4752]: E1124 11:23:39.298412 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a\": container with ID starting with d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a not found: ID does not exist" containerID="d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.298540 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a"} err="failed to get container status \"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a\": rpc error: code = NotFound desc = could not find container \"d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a\": container with ID starting with d0809b9c5076d92e13d72d854ceba77e542503440a2ffed6e558da8be7898f7a not found: ID does not exist" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.298627 4752 scope.go:117] "RemoveContainer" containerID="b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da" Nov 24 11:23:39 crc kubenswrapper[4752]: E1124 11:23:39.299317 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da\": container with ID starting with b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da not found: ID does not exist" containerID="b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da" Nov 24 11:23:39 crc kubenswrapper[4752]: I1124 11:23:39.299365 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da"} err="failed to get container status \"b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da\": rpc error: code = NotFound desc = could not find container \"b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da\": container with ID starting with b42434001f9b02e2f0dd76869d4e7bdcb382a7ea6e37e3cf8f553847b60e00da not found: ID does not exist" Nov 24 11:23:40 crc 
kubenswrapper[4752]: I1124 11:23:40.186418 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerStarted","Data":"50d83bb320cd3643b150ab5645490c9c124d9eff8abf9994e68c250ab8f36395"} Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.193223 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-tdgsl" event={"ID":"ec9e7ec8-7392-4362-8576-d62ca0508254","Type":"ContainerStarted","Data":"9fc9daf79bc6a767e3df083652a066248cae02378b9f0cf58166d676e1860bfa"} Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.193358 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.194839 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerStarted","Data":"ca63fd0de5f52cf43fe00bb9dfbd1436710419f23b5ae8d734e790c2aa804e2b"} Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.198555 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerStarted","Data":"3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253"} Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.198587 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.198599 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerStarted","Data":"8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce"} Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.200549 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.241508 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-tdgsl" podStartSLOduration=5.241489434 podStartE2EDuration="5.241489434s" podCreationTimestamp="2025-11-24 11:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:23:40.23339309 +0000 UTC m=+1026.218213379" watchObservedRunningTime="2025-11-24 11:23:40.241489434 +0000 UTC m=+1026.226309723" Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.255586 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-v6jhw" podStartSLOduration=9.784094660000001 podStartE2EDuration="17.255569692s" podCreationTimestamp="2025-11-24 11:23:23 +0000 UTC" firstStartedPulling="2025-11-24 11:23:29.37848952 +0000 UTC m=+1015.363309809" lastFinishedPulling="2025-11-24 11:23:36.849964552 +0000 UTC m=+1022.834784841" observedRunningTime="2025-11-24 11:23:40.249931649 +0000 UTC m=+1026.234751938" watchObservedRunningTime="2025-11-24 11:23:40.255569692 +0000 UTC m=+1026.240389981" Nov 24 11:23:40 crc kubenswrapper[4752]: I1124 11:23:40.737186 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" path="/var/lib/kubelet/pods/ddf3eb37-e41b-4a4f-b394-83cc91c4d837/volumes" Nov 24 11:23:41 crc kubenswrapper[4752]: I1124 
11:23:41.208526 4752 generic.go:334] "Generic (PLEG): container finished" podID="e3426b66-ea91-4d99-86c5-955d77073619" containerID="d34722d3f40c1996b7a80559039e9a50e8496db2ea2f4dc7d64da0f375ada589" exitCode=0 Nov 24 11:23:41 crc kubenswrapper[4752]: I1124 11:23:41.208669 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerDied","Data":"d34722d3f40c1996b7a80559039e9a50e8496db2ea2f4dc7d64da0f375ada589"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.219965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cm8dh" event={"ID":"01bd298d-28f4-4c50-80b3-f81ed96de794","Type":"ContainerStarted","Data":"c9770a80fae990d49f8d0649788f9b375b5556a34ea582ab37233bdac9368a5a"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.223479 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerStarted","Data":"e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.225461 4752 generic.go:334] "Generic (PLEG): container finished" podID="30292578-890d-42e8-b689-b909de083b48" containerID="62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f" exitCode=0 Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.225576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerDied","Data":"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.228046 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerStarted","Data":"44a0f217799b6ba62959ff62bff9dd54d47f9e3c7d76763038b8475083244095"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.232019 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerStarted","Data":"00c9561e9a5735f6ed6ea2bd3f779c56f52d92d1f868c56150ff9276fc50cb92"} Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.255657 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cm8dh" podStartSLOduration=4.059641158 podStartE2EDuration="7.255633937s" podCreationTimestamp="2025-11-24 11:23:35 +0000 UTC" firstStartedPulling="2025-11-24 11:23:37.896970178 +0000 UTC m=+1023.881790467" lastFinishedPulling="2025-11-24 11:23:41.092962957 +0000 UTC m=+1027.077783246" observedRunningTime="2025-11-24 11:23:42.236887525 +0000 UTC m=+1028.221707814" watchObservedRunningTime="2025-11-24 11:23:42.255633937 +0000 UTC m=+1028.240454226" Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.309446 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.059588324 podStartE2EDuration="16.309430953s" podCreationTimestamp="2025-11-24 11:23:26 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.816834558 +0000 UTC m=+1014.801654847" lastFinishedPulling="2025-11-24 11:23:41.066677187 +0000 UTC m=+1027.051497476" observedRunningTime="2025-11-24 11:23:42.299950109 +0000 UTC m=+1028.284770398" watchObservedRunningTime="2025-11-24 11:23:42.309430953 +0000 UTC m=+1028.294251242" Nov 24 
11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.342566 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.621192137 podStartE2EDuration="19.34254856s" podCreationTimestamp="2025-11-24 11:23:23 +0000 UTC" firstStartedPulling="2025-11-24 11:23:31.358445693 +0000 UTC m=+1017.343265982" lastFinishedPulling="2025-11-24 11:23:41.079802116 +0000 UTC m=+1027.064622405" observedRunningTime="2025-11-24 11:23:42.337671839 +0000 UTC m=+1028.322492148" watchObservedRunningTime="2025-11-24 11:23:42.34254856 +0000 UTC m=+1028.327368849" Nov 24 11:23:42 crc kubenswrapper[4752]: I1124 11:23:42.362689 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.090810685 podStartE2EDuration="25.362675413s" podCreationTimestamp="2025-11-24 11:23:17 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.51985943 +0000 UTC m=+1014.504679719" lastFinishedPulling="2025-11-24 11:23:36.791724158 +0000 UTC m=+1022.776544447" observedRunningTime="2025-11-24 11:23:42.357933495 +0000 UTC m=+1028.342753784" watchObservedRunningTime="2025-11-24 11:23:42.362675413 +0000 UTC m=+1028.347495702" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.008987 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.009356 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.064362 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.219056 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.243895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerStarted","Data":"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72"} Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.264733 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.037553362 podStartE2EDuration="28.264713687s" podCreationTimestamp="2025-11-24 11:23:15 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.711824651 +0000 UTC m=+1014.696644940" lastFinishedPulling="2025-11-24 11:23:36.938984976 +0000 UTC m=+1022.923805265" observedRunningTime="2025-11-24 11:23:43.264086989 +0000 UTC m=+1029.248907298" watchObservedRunningTime="2025-11-24 11:23:43.264713687 +0000 UTC m=+1029.249533976" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.267563 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.295103 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 24 11:23:43 crc kubenswrapper[4752]: I1124 11:23:43.690706 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.253915 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 
11:23:44.300298 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.489583 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:23:44 crc kubenswrapper[4752]: E1124 11:23:44.489958 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="init" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.489974 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="init" Nov 24 11:23:44 crc kubenswrapper[4752]: E1124 11:23:44.489998 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="dnsmasq-dns" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.490005 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="dnsmasq-dns" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.490165 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf3eb37-e41b-4a4f-b394-83cc91c4d837" containerName="dnsmasq-dns" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.490991 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.493900 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9s7cf" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.494617 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.494659 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.494688 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.501315 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.652632 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653011 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czr57\" (UniqueName: \"kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653166 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653208 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653415 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653522 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.653619 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755252 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755310 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czr57\" (UniqueName: \"kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755373 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755395 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755448 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755474 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.755496 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.757254 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.758533 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.759092 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.770585 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.770898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.776536 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.776616 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czr57\" (UniqueName: \"kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57\") pod \"ovn-northd-0\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " pod="openstack/ovn-northd-0" Nov 24 11:23:44 crc kubenswrapper[4752]: I1124 11:23:44.816823 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 11:23:45 crc kubenswrapper[4752]: W1124 11:23:45.282320 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1077002f_be90_4e09_8158_efdc98329e5b.slice/crio-1ddb11e8cd6745332066e35a57f83011763d8e124d2b3c44c89bcc30b301c98e WatchSource:0}: Error finding container 1ddb11e8cd6745332066e35a57f83011763d8e124d2b3c44c89bcc30b301c98e: Status 404 returned error can't find the container with id 1ddb11e8cd6745332066e35a57f83011763d8e124d2b3c44c89bcc30b301c98e Nov 24 11:23:45 crc kubenswrapper[4752]: I1124 11:23:45.282895 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:23:46 crc kubenswrapper[4752]: I1124 11:23:46.269144 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerStarted","Data":"1ddb11e8cd6745332066e35a57f83011763d8e124d2b3c44c89bcc30b301c98e"} Nov 24 11:23:46 crc kubenswrapper[4752]: I1124 11:23:46.336921 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:23:46 crc kubenswrapper[4752]: I1124 11:23:46.396005 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:46 crc kubenswrapper[4752]: I1124 11:23:46.396295 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="dnsmasq-dns" containerID="cri-o://162f2f38a32ff4f50b02e21cbe01ccd0e9abe6155ab668e7f74c46cf3a0bd5d3" gracePeriod=10 Nov 24 11:23:47 crc kubenswrapper[4752]: I1124 11:23:47.053809 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 24 11:23:47 crc kubenswrapper[4752]: I1124 11:23:47.053853 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 24 11:23:47 crc kubenswrapper[4752]: I1124 11:23:47.279797 4752 generic.go:334] "Generic (PLEG): container finished" podID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerID="162f2f38a32ff4f50b02e21cbe01ccd0e9abe6155ab668e7f74c46cf3a0bd5d3" exitCode=0 Nov 24 11:23:47 crc kubenswrapper[4752]: I1124 11:23:47.279848 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" event={"ID":"152a7f03-9669-4f3f-977e-0da6d03122f5","Type":"ContainerDied","Data":"162f2f38a32ff4f50b02e21cbe01ccd0e9abe6155ab668e7f74c46cf3a0bd5d3"} Nov 24 11:23:48 crc kubenswrapper[4752]: I1124 11:23:48.753578 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:48 crc kubenswrapper[4752]: I1124 11:23:48.753917 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:49 crc kubenswrapper[4752]: I1124 11:23:49.521922 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: connect: connection refused" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.641575 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 24 11:23:50 crc kubenswrapper[4752]: 
I1124 11:23:50.673881 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"] Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.676028 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.698211 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"] Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.772102 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwg4p\" (UniqueName: \"kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.772184 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.772231 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.772261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.772286 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.799262 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.817815 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873275 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc\") pod \"152a7f03-9669-4f3f-977e-0da6d03122f5\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873356 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config\") pod \"152a7f03-9669-4f3f-977e-0da6d03122f5\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873489 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx8vw\" (UniqueName: \"kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw\") pod \"152a7f03-9669-4f3f-977e-0da6d03122f5\" (UID: \"152a7f03-9669-4f3f-977e-0da6d03122f5\") " Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873725 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873817 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873845 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.873948 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwg4p\" (UniqueName: \"kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.874000 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.875026 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.876087 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.877315 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.877895 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.889943 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw" (OuterVolumeSpecName: "kube-api-access-dx8vw") pod "152a7f03-9669-4f3f-977e-0da6d03122f5" (UID: "152a7f03-9669-4f3f-977e-0da6d03122f5"). InnerVolumeSpecName "kube-api-access-dx8vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.928629 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwg4p\" (UniqueName: \"kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p\") pod \"dnsmasq-dns-b8fbc5445-prc8b\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") " pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.964873 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "152a7f03-9669-4f3f-977e-0da6d03122f5" (UID: "152a7f03-9669-4f3f-977e-0da6d03122f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.970337 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config" (OuterVolumeSpecName: "config") pod "152a7f03-9669-4f3f-977e-0da6d03122f5" (UID: "152a7f03-9669-4f3f-977e-0da6d03122f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.979947 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx8vw\" (UniqueName: \"kubernetes.io/projected/152a7f03-9669-4f3f-977e-0da6d03122f5-kube-api-access-dx8vw\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.980016 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:50 crc kubenswrapper[4752]: I1124 11:23:50.980030 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152a7f03-9669-4f3f-977e-0da6d03122f5-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.003231 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.112259 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.314396 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" event={"ID":"152a7f03-9669-4f3f-977e-0da6d03122f5","Type":"ContainerDied","Data":"859b0faa5a37a68ef428441a2e3d48617818fa4577c9fbd32676a827dc16a410"} Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.314420 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mtl48" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.314451 4752 scope.go:117] "RemoveContainer" containerID="162f2f38a32ff4f50b02e21cbe01ccd0e9abe6155ab668e7f74c46cf3a0bd5d3" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.344690 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.350835 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mtl48"] Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.468453 4752 scope.go:117] "RemoveContainer" containerID="1c1e759158a12e5160fb333c16f566796a888f4675fa0f3eb44a700dd00afe8e" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.519249 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"] Nov 24 11:23:51 crc kubenswrapper[4752]: W1124 11:23:51.539730 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6a8e0cd_1585_416d_9142_fbe4991e8cda.slice/crio-fae57d95dcb3c404f8071f7045bafa84c91d0011114ccab4adbef1ea8e51c254 WatchSource:0}: Error finding container fae57d95dcb3c404f8071f7045bafa84c91d0011114ccab4adbef1ea8e51c254: Status 404 returned error can't find the container with id fae57d95dcb3c404f8071f7045bafa84c91d0011114ccab4adbef1ea8e51c254 Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.894853 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 24 11:23:51 crc kubenswrapper[4752]: E1124 11:23:51.896027 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="init" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.896043 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="init" Nov 24 11:23:51 crc kubenswrapper[4752]: E1124 11:23:51.896068 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="dnsmasq-dns" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.896074 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="dnsmasq-dns" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.896669 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" containerName="dnsmasq-dns" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.912283 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.922313 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.922419 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.922590 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.922731 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rj74h" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.922916 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.997625 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjs2z\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.997899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.998057 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.998129 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:51 crc kubenswrapper[4752]: I1124 11:23:51.998215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 
11:23:52.099425 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.099878 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.099944 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjs2z\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.099978 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.100021 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.100113 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.100138 4752 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.100163 4752 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.100233 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift podName:d140408d-b7e5-4ba0-9404-877498cf18a1 nodeName:}" failed. No retries permitted until 2025-11-24 11:23:52.600213982 +0000 UTC m=+1038.585034271 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift") pod "swift-storage-0" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1") : configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.100335 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.100555 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.120260 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.120561 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjs2z\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.324230 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerStarted","Data":"04d87e64c2950336bd5ce6ab63d3778aa490d6f9336803686ba547e5f016a8d5"} Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.324276 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerStarted","Data":"cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7"} Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.324320 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.326804 4752 generic.go:334] "Generic (PLEG): container finished" podID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerID="2b7c2876d99a171eee9ad830ee388c74f56a340be2c36c547eb8a428b01a7b82" exitCode=0 Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.326843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" event={"ID":"d6a8e0cd-1585-416d-9142-fbe4991e8cda","Type":"ContainerDied","Data":"2b7c2876d99a171eee9ad830ee388c74f56a340be2c36c547eb8a428b01a7b82"} Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.326864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" event={"ID":"d6a8e0cd-1585-416d-9142-fbe4991e8cda","Type":"ContainerStarted","Data":"fae57d95dcb3c404f8071f7045bafa84c91d0011114ccab4adbef1ea8e51c254"} Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.355447 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.092791687 podStartE2EDuration="8.355416451s" 
podCreationTimestamp="2025-11-24 11:23:44 +0000 UTC" firstStartedPulling="2025-11-24 11:23:45.285179843 +0000 UTC m=+1031.270000132" lastFinishedPulling="2025-11-24 11:23:51.547804597 +0000 UTC m=+1037.532624896" observedRunningTime="2025-11-24 11:23:52.350572011 +0000 UTC m=+1038.335392300" watchObservedRunningTime="2025-11-24 11:23:52.355416451 +0000 UTC m=+1038.340236740" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.608510 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.608696 4752 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.608721 4752 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: E1124 11:23:52.608801 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift podName:d140408d-b7e5-4ba0-9404-877498cf18a1 nodeName:}" failed. No retries permitted until 2025-11-24 11:23:53.608775967 +0000 UTC m=+1039.593596256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift") pod "swift-storage-0" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1") : configmap "swift-ring-files" not found Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.746970 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152a7f03-9669-4f3f-977e-0da6d03122f5" path="/var/lib/kubelet/pods/152a7f03-9669-4f3f-977e-0da6d03122f5/volumes" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.847493 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:52 crc kubenswrapper[4752]: I1124 11:23:52.919290 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.334198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" event={"ID":"d6a8e0cd-1585-416d-9142-fbe4991e8cda","Type":"ContainerStarted","Data":"7fee1cd1a6c9edd3a15d864d2449f04014cf565e182b53a6fe8f3432c046e3ab"} Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.334721 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.354880 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" podStartSLOduration=3.354861792 podStartE2EDuration="3.354861792s" podCreationTimestamp="2025-11-24 11:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:23:53.351719021 +0000 UTC m=+1039.336539320" watchObservedRunningTime="2025-11-24 11:23:53.354861792 +0000 UTC m=+1039.339682091" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.623084 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:53 crc kubenswrapper[4752]: E1124 11:23:53.623309 4752 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 11:23:53 crc kubenswrapper[4752]: E1124 11:23:53.623333 4752 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 11:23:53 crc kubenswrapper[4752]: E1124 11:23:53.623397 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift podName:d140408d-b7e5-4ba0-9404-877498cf18a1 nodeName:}" failed. No retries permitted until 2025-11-24 11:23:55.62337518 +0000 UTC m=+1041.608195469 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift") pod "swift-storage-0" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1") : configmap "swift-ring-files" not found Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.906407 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f4c5-account-create-6zcx6"] Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.907668 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.910921 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.916931 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f4c5-account-create-6zcx6"] Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.940908 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-j8hcn"] Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.941976 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:53 crc kubenswrapper[4752]: I1124 11:23:53.957667 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j8hcn"] Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.032639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fk54\" (UniqueName: \"kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.032709 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.032801 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fts\" (UniqueName: \"kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.032835 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.134814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fk54\" (UniqueName: \"kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.134883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.134923 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fts\" (UniqueName: \"kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.134956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.135819 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.135878 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.194669 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fts\" (UniqueName: \"kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts\") pod \"glance-f4c5-account-create-6zcx6\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.194859 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fk54\" (UniqueName: \"kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54\") pod \"glance-db-create-j8hcn\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.232218 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.262411 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.687933 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f4c5-account-create-6zcx6"] Nov 24 11:23:54 crc kubenswrapper[4752]: I1124 11:23:54.768243 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j8hcn"] Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.354456 4752 generic.go:334] "Generic (PLEG): container finished" podID="2a954e1c-c17d-4296-b165-22b1653af22f" containerID="4407b9469063ff93a22f5f0b8c37efedcfbeb77bcd5803ed59f64e869161de89" exitCode=0 Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.354560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j8hcn" event={"ID":"2a954e1c-c17d-4296-b165-22b1653af22f","Type":"ContainerDied","Data":"4407b9469063ff93a22f5f0b8c37efedcfbeb77bcd5803ed59f64e869161de89"} Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.354604 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j8hcn" event={"ID":"2a954e1c-c17d-4296-b165-22b1653af22f","Type":"ContainerStarted","Data":"c5879f7fdb8bdad8b7dec2c75289ae23ed12176c7357d21b193fc4b70bed3f45"} Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.355774 4752 generic.go:334] "Generic (PLEG): container finished" podID="0e756e30-c210-4624-a396-e9e05691f1ed" containerID="04e513bef10e5502ac575ac143cf0082eb87c1fcae98ef5badd9c267c63d7fa4" exitCode=0 Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.355816 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4c5-account-create-6zcx6" event={"ID":"0e756e30-c210-4624-a396-e9e05691f1ed","Type":"ContainerDied","Data":"04e513bef10e5502ac575ac143cf0082eb87c1fcae98ef5badd9c267c63d7fa4"} Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.355832 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4c5-account-create-6zcx6" event={"ID":"0e756e30-c210-4624-a396-e9e05691f1ed","Type":"ContainerStarted","Data":"30ae9a2e881223168524a88a189e6bf97f45a204b955334ebdddf5d441c01687"} Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.673071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:55 crc kubenswrapper[4752]: E1124 11:23:55.673274 4752 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 11:23:55 crc kubenswrapper[4752]: E1124 11:23:55.673637 4752 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 11:23:55 crc kubenswrapper[4752]: E1124 11:23:55.674106 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift podName:d140408d-b7e5-4ba0-9404-877498cf18a1 nodeName:}" failed. No retries permitted until 2025-11-24 11:23:59.673677693 +0000 UTC m=+1045.658497982 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift") pod "swift-storage-0" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1") : configmap "swift-ring-files" not found Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.802757 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rmx7z"] Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.803913 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.809551 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.809616 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.809693 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.822801 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rmx7z"] Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.829778 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rmx7z"] Nov 24 11:23:55 crc kubenswrapper[4752]: E1124 11:23:55.830335 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rdrsk ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-rmx7z" podUID="f276d434-1756-4be0-a4b1-58c6ead29161" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.850974 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vvkzg"] Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.852936 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.869936 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vvkzg"] Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876066 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrsk\" (UniqueName: \"kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876096 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876137 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876447 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.876604 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978246 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978284 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxbh\" (UniqueName: \"kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978328 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978372 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978402 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978452 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978473 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978512 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrsk\" (UniqueName: \"kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978553 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.978569 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.979226 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.979540 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.979641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.992543 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:55 crc kubenswrapper[4752]: I1124 11:23:55.992710 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.004237 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrsk\" (UniqueName: 
\"kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.005198 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf\") pod \"swift-ring-rebalance-rmx7z\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.079807 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.079918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.079951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxbh\" (UniqueName: \"kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.079993 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.080018 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.080049 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.080066 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.080777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices\") pod 
\"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.081301 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.081398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.083116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.083306 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.083994 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.098472 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxbh\" (UniqueName: \"kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh\") pod \"swift-ring-rebalance-vvkzg\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.179858 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.380434 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.394853 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491385 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491450 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdrsk\" (UniqueName: \"kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491491 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491530 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491596 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491625 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.491706 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices\") pod \"f276d434-1756-4be0-a4b1-58c6ead29161\" (UID: \"f276d434-1756-4be0-a4b1-58c6ead29161\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.492670 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.493676 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts" (OuterVolumeSpecName: "scripts") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.493853 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.497172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk" (OuterVolumeSpecName: "kube-api-access-rdrsk") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "kube-api-access-rdrsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.498276 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.498572 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.501922 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f276d434-1756-4be0-a4b1-58c6ead29161" (UID: "f276d434-1756-4be0-a4b1-58c6ead29161"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593690 4752 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593723 4752 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593735 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdrsk\" (UniqueName: \"kubernetes.io/projected/f276d434-1756-4be0-a4b1-58c6ead29161-kube-api-access-rdrsk\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593763 4752 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f276d434-1756-4be0-a4b1-58c6ead29161-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593775 4752 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593786 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f276d434-1756-4be0-a4b1-58c6ead29161-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.593796 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f276d434-1756-4be0-a4b1-58c6ead29161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.596239 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vvkzg"] Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.687970 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.738373 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.796384 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts\") pod \"0e756e30-c210-4624-a396-e9e05691f1ed\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.796775 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fts\" (UniqueName: \"kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts\") pod \"0e756e30-c210-4624-a396-e9e05691f1ed\" (UID: \"0e756e30-c210-4624-a396-e9e05691f1ed\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.796998 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fk54\" (UniqueName: \"kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54\") pod \"2a954e1c-c17d-4296-b165-22b1653af22f\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.797059 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts\") pod \"2a954e1c-c17d-4296-b165-22b1653af22f\" (UID: \"2a954e1c-c17d-4296-b165-22b1653af22f\") " Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.797232 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e756e30-c210-4624-a396-e9e05691f1ed" (UID: "0e756e30-c210-4624-a396-e9e05691f1ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.797533 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e756e30-c210-4624-a396-e9e05691f1ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.798518 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a954e1c-c17d-4296-b165-22b1653af22f" (UID: "2a954e1c-c17d-4296-b165-22b1653af22f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.801946 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts" (OuterVolumeSpecName: "kube-api-access-p6fts") pod "0e756e30-c210-4624-a396-e9e05691f1ed" (UID: "0e756e30-c210-4624-a396-e9e05691f1ed"). InnerVolumeSpecName "kube-api-access-p6fts". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.802004 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54" (OuterVolumeSpecName: "kube-api-access-9fk54") pod "2a954e1c-c17d-4296-b165-22b1653af22f" (UID: "2a954e1c-c17d-4296-b165-22b1653af22f"). InnerVolumeSpecName "kube-api-access-9fk54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.898956 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fk54\" (UniqueName: \"kubernetes.io/projected/2a954e1c-c17d-4296-b165-22b1653af22f-kube-api-access-9fk54\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.898988 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a954e1c-c17d-4296-b165-22b1653af22f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:56 crc kubenswrapper[4752]: I1124 11:23:56.898997 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6fts\" (UniqueName: \"kubernetes.io/projected/0e756e30-c210-4624-a396-e9e05691f1ed-kube-api-access-p6fts\") on node \"crc\" DevicePath \"\"" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.392194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j8hcn" event={"ID":"2a954e1c-c17d-4296-b165-22b1653af22f","Type":"ContainerDied","Data":"c5879f7fdb8bdad8b7dec2c75289ae23ed12176c7357d21b193fc4b70bed3f45"} Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.392238 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5879f7fdb8bdad8b7dec2c75289ae23ed12176c7357d21b193fc4b70bed3f45" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.392287 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-j8hcn" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.395629 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4c5-account-create-6zcx6" event={"ID":"0e756e30-c210-4624-a396-e9e05691f1ed","Type":"ContainerDied","Data":"30ae9a2e881223168524a88a189e6bf97f45a204b955334ebdddf5d441c01687"} Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.395653 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ae9a2e881223168524a88a189e6bf97f45a204b955334ebdddf5d441c01687" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.395655 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4c5-account-create-6zcx6" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.396965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvkzg" event={"ID":"0ecf935a-a2fa-4fb4-91f4-46f5564cb094","Type":"ContainerStarted","Data":"e3275f2e46442a0e5935e4e36b083c5cd9a29cc63f2996ca7f6227a4f12165ac"} Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.396990 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rmx7z" Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.448736 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rmx7z"] Nov 24 11:23:57 crc kubenswrapper[4752]: I1124 11:23:57.455102 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rmx7z"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.267924 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4kfpv"] Nov 24 11:23:58 crc kubenswrapper[4752]: E1124 11:23:58.268686 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a954e1c-c17d-4296-b165-22b1653af22f" containerName="mariadb-database-create" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.268704 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a954e1c-c17d-4296-b165-22b1653af22f" containerName="mariadb-database-create" Nov 24 11:23:58 crc kubenswrapper[4752]: E1124 11:23:58.268721 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e756e30-c210-4624-a396-e9e05691f1ed" containerName="mariadb-account-create" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.268727 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e756e30-c210-4624-a396-e9e05691f1ed" containerName="mariadb-account-create" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.268924 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e756e30-c210-4624-a396-e9e05691f1ed" containerName="mariadb-account-create" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.268944 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a954e1c-c17d-4296-b165-22b1653af22f" containerName="mariadb-database-create" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.269769 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.290819 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4kfpv"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.327095 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvdc\" (UniqueName: \"kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.327466 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.373469 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.374679 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.383948 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.386166 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.428947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.429056 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvdc\" (UniqueName: \"kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.429103 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mggk5\" (UniqueName: \"kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.429178 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.431570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.448140 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvdc\" (UniqueName: \"kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc\") pod \"keystone-db-create-4kfpv\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.531914 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.532085 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mggk5\" (UniqueName: \"kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: 
\"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.535729 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.550589 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mggk5\" (UniqueName: \"kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5\") pod \"keystone-7b4f-account-create-pq4n9\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.595160 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4kfpv" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.701920 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.711803 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mbqd5"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.714873 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.754135 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f276d434-1756-4be0-a4b1-58c6ead29161" path="/var/lib/kubelet/pods/f276d434-1756-4be0-a4b1-58c6ead29161/volumes" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.754907 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mbqd5"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.828539 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-83dd-account-create-m48bt"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.830454 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.832643 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.837563 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83dd-account-create-m48bt"] Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.837727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9xh\" (UniqueName: \"kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh\") pod \"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.837906 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts\") pod \"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.939238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts\") pod \"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.939292 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.939344 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk48f\" (UniqueName: \"kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.939473 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9xh\" (UniqueName: \"kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh\") pod \"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.940227 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts\") pod \"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:58 crc kubenswrapper[4752]: I1124 11:23:58.968390 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9xh\" (UniqueName: \"kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh\") pod 
\"placement-db-create-mbqd5\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.041529 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.041595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk48f\" (UniqueName: \"kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.042655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.052238 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbqd5" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.057398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk48f\" (UniqueName: \"kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f\") pod \"placement-83dd-account-create-m48bt\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.142238 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xx5fv"] Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.143313 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.151837 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.153817 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.153915 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z5nd5" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.161499 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xx5fv"] Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.243993 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdls8\" (UniqueName: \"kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.244055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.244174 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.244541 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.345907 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.346153 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdls8\" (UniqueName: \"kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.346233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.346268 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle\") pod 
\"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.348959 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.353113 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.363272 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.410668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdls8\" (UniqueName: \"kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8\") pod \"glance-db-sync-xx5fv\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.476893 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xx5fv" Nov 24 11:23:59 crc kubenswrapper[4752]: I1124 11:23:59.754093 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:23:59 crc kubenswrapper[4752]: E1124 11:23:59.754282 4752 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 11:23:59 crc kubenswrapper[4752]: E1124 11:23:59.754306 4752 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 11:23:59 crc kubenswrapper[4752]: E1124 11:23:59.754362 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift podName:d140408d-b7e5-4ba0-9404-877498cf18a1 nodeName:}" failed. No retries permitted until 2025-11-24 11:24:07.754344861 +0000 UTC m=+1053.739165150 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift") pod "swift-storage-0" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1") : configmap "swift-ring-files" not found Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.424080 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvkzg" event={"ID":"0ecf935a-a2fa-4fb4-91f4-46f5564cb094","Type":"ContainerStarted","Data":"954b47a267aaf50acc4f84e04c8beafb487e7e1d28523852ecbb0dc2c03e0221"} Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.449392 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vvkzg" podStartSLOduration=1.968685391 podStartE2EDuration="5.449372333s" podCreationTimestamp="2025-11-24 11:23:55 +0000 UTC" firstStartedPulling="2025-11-24 11:23:56.632715163 +0000 UTC m=+1042.617535462" lastFinishedPulling="2025-11-24 11:24:00.113402115 +0000 UTC m=+1046.098222404" observedRunningTime="2025-11-24 11:24:00.444364778 +0000 UTC m=+1046.429185077" watchObservedRunningTime="2025-11-24 11:24:00.449372333 +0000 UTC m=+1046.434192622" Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.627457 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mbqd5"] Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.627907 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df74304_1dc2_466a_b089_ecfa45af2ce1.slice/crio-ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b WatchSource:0}: Error finding container ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b: Status 404 returned error can't find the container with id ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.717341 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c789332_baf6_4602_b509_7421b2ff22fe.slice/crio-56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60 WatchSource:0}: Error finding container 56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60: Status 404 returned error can't find the container with id 56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60 Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.721312 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4kfpv"] Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.727293 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"] Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.743145 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93537e9d_a8b8_4fb4_b982_2971ccf60f9d.slice/crio-60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7 WatchSource:0}: Error finding container 60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7: Status 404 returned error can't find the container with id 60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7 Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.773826 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83dd-account-create-m48bt"] Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.835849 4752 
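[Editor's note] The three errors above show a projected-volume mount failing because the swift-ring-files ConfigMap does not exist yet (it is produced by the swift-ring-rebalance job that is still running at this point). Note that the kubelet does not retry immediately: nestedpendingoperations re-queues the operation with a backoff ("No retries permitted until ... durationBeforeRetry 8s"), and the retry succeeds at 11:24:07 once the ConfigMap exists, as seen later in this log. A minimal stdlib-Go sketch of this capped-exponential-backoff pattern (illustrative only; mount, the attempt counter, and the delay constants are hypothetical, not kubelet source):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

func main() {
	// Hypothetical stand-in for the failing mount: it keeps failing
	// until the swift-ring-files ConfigMap "appears" on attempt 3.
	mount := func(attempt int) error {
		if attempt < 3 {
			return errors.New(`configmap "swift-ring-files" not found`)
		}
		return nil
	}

	delay := 500 * time.Millisecond // initial backoff (assumed value)
	const maxDelay = 2 * time.Minute

	for attempt := 1; ; attempt++ {
		if err := mount(attempt); err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		} else {
			// Mirrors the log line: the operation is re-queued with a
			// deadline, not retried at once.
			fmt.Printf("attempt %d failed: %v; no retries permitted until %s (durationBeforeRetry %s)\n",
				attempt, err, time.Now().Add(delay).Format(time.RFC3339), delay)
		}
		time.Sleep(delay)
		delay *= 2 // double the wait each failure, capped below
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```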
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.424080 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvkzg" event={"ID":"0ecf935a-a2fa-4fb4-91f4-46f5564cb094","Type":"ContainerStarted","Data":"954b47a267aaf50acc4f84e04c8beafb487e7e1d28523852ecbb0dc2c03e0221"}
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.449392 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vvkzg" podStartSLOduration=1.968685391 podStartE2EDuration="5.449372333s" podCreationTimestamp="2025-11-24 11:23:55 +0000 UTC" firstStartedPulling="2025-11-24 11:23:56.632715163 +0000 UTC m=+1042.617535462" lastFinishedPulling="2025-11-24 11:24:00.113402115 +0000 UTC m=+1046.098222404" observedRunningTime="2025-11-24 11:24:00.444364778 +0000 UTC m=+1046.429185077" watchObservedRunningTime="2025-11-24 11:24:00.449372333 +0000 UTC m=+1046.434192622"
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.627457 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mbqd5"]
Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.627907 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df74304_1dc2_466a_b089_ecfa45af2ce1.slice/crio-ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b WatchSource:0}: Error finding container ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b: Status 404 returned error can't find the container with id ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b
Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.717341 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c789332_baf6_4602_b509_7421b2ff22fe.slice/crio-56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60 WatchSource:0}: Error finding container 56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60: Status 404 returned error can't find the container with id 56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.721312 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4kfpv"]
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.727293 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"]
Nov 24 11:24:00 crc kubenswrapper[4752]: W1124 11:24:00.743145 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93537e9d_a8b8_4fb4_b982_2971ccf60f9d.slice/crio-60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7 WatchSource:0}: Error finding container 60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7: Status 404 returned error can't find the container with id 60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.773826 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83dd-account-create-m48bt"]
Nov 24 11:24:00 crc kubenswrapper[4752]: I1124 11:24:00.835849 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xx5fv"]
Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.004928 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b"
Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.082195 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"]
Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.082462 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-tdgsl" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="dnsmasq-dns" containerID="cri-o://9fc9daf79bc6a767e3df083652a066248cae02378b9f0cf58166d676e1860bfa" gracePeriod=10
Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.336248 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-tdgsl" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused"
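[Editor's note] The readiness failure above is expected: the old dnsmasq-dns pod is being killed, so its TCP probe gets "connection refused". A TCP readiness check of this shape can be reproduced with stdlib Go (a sketch, not kubelet's prober code; the 1 s timeout is an assumption, only the address/port come from the log line):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// probeTCP mimics a tcpSocket readiness probe: the probe passes if the
// connection opens; "connection refused" maps to a probe failure.
func probeTCP(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	// 10.217.0.113:5353 is the pod IP and port from the failed probe above.
	if err := probeTCP("10.217.0.113:5353", time.Second); err != nil {
		fmt.Printf("Probe failed: %v\n", err)
		return
	}
	fmt.Println("ready")
}
```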
event={"ID":"3c789332-baf6-4602-b509-7421b2ff22fe","Type":"ContainerStarted","Data":"94a1d35075cddbba48a56524dbb8d50a49c8101d32685bfc1f3920e8a633ef21"} Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.468966 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4kfpv" event={"ID":"3c789332-baf6-4602-b509-7421b2ff22fe","Type":"ContainerStarted","Data":"56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60"} Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.470825 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83dd-account-create-m48bt" event={"ID":"acd74806-0b12-4c05-9594-acbaf92b9af9","Type":"ContainerStarted","Data":"c9c0faa7fabab5aa35b498dc4929bf133db55cdfd0aaa0be2d13f77832bd89d9"} Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.470855 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83dd-account-create-m48bt" event={"ID":"acd74806-0b12-4c05-9594-acbaf92b9af9","Type":"ContainerStarted","Data":"be616eb4cc13396e2c2cd1b27da3380908a24a13b67a1d916e2870b8152f7d34"} Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.485965 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b4f-account-create-pq4n9" podStartSLOduration=3.485943744 podStartE2EDuration="3.485943744s" podCreationTimestamp="2025-11-24 11:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:01.462205768 +0000 UTC m=+1047.447026057" watchObservedRunningTime="2025-11-24 11:24:01.485943744 +0000 UTC m=+1047.470764033" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.487325 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-mbqd5" podStartSLOduration=3.487314194 podStartE2EDuration="3.487314194s" podCreationTimestamp="2025-11-24 11:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:01.479403545 +0000 UTC m=+1047.464223834" watchObservedRunningTime="2025-11-24 11:24:01.487314194 +0000 UTC m=+1047.472134503" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.501902 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-83dd-account-create-m48bt" podStartSLOduration=3.501885155 podStartE2EDuration="3.501885155s" podCreationTimestamp="2025-11-24 11:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:01.497301533 +0000 UTC m=+1047.482121822" watchObservedRunningTime="2025-11-24 11:24:01.501885155 +0000 UTC m=+1047.486705444" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.522255 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-4kfpv" podStartSLOduration=3.522234273 podStartE2EDuration="3.522234273s" podCreationTimestamp="2025-11-24 11:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:01.517011112 +0000 UTC m=+1047.501831411" watchObservedRunningTime="2025-11-24 11:24:01.522234273 +0000 UTC m=+1047.507054562" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.739612 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.748080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb\") pod \"ec9e7ec8-7392-4362-8576-d62ca0508254\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.748143 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmfw\" (UniqueName: \"kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw\") pod \"ec9e7ec8-7392-4362-8576-d62ca0508254\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.756462 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw" (OuterVolumeSpecName: "kube-api-access-rgmfw") pod "ec9e7ec8-7392-4362-8576-d62ca0508254" (UID: "ec9e7ec8-7392-4362-8576-d62ca0508254"). InnerVolumeSpecName "kube-api-access-rgmfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.829371 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec9e7ec8-7392-4362-8576-d62ca0508254" (UID: "ec9e7ec8-7392-4362-8576-d62ca0508254"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.851500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc\") pod \"ec9e7ec8-7392-4362-8576-d62ca0508254\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.851707 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb\") pod \"ec9e7ec8-7392-4362-8576-d62ca0508254\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.851799 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config\") pod \"ec9e7ec8-7392-4362-8576-d62ca0508254\" (UID: \"ec9e7ec8-7392-4362-8576-d62ca0508254\") " Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.852119 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.852136 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmfw\" (UniqueName: \"kubernetes.io/projected/ec9e7ec8-7392-4362-8576-d62ca0508254-kube-api-access-rgmfw\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.895015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config" (OuterVolumeSpecName: "config") pod 
"ec9e7ec8-7392-4362-8576-d62ca0508254" (UID: "ec9e7ec8-7392-4362-8576-d62ca0508254"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.898088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec9e7ec8-7392-4362-8576-d62ca0508254" (UID: "ec9e7ec8-7392-4362-8576-d62ca0508254"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.903969 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec9e7ec8-7392-4362-8576-d62ca0508254" (UID: "ec9e7ec8-7392-4362-8576-d62ca0508254"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.955527 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.955563 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:01 crc kubenswrapper[4752]: I1124 11:24:01.955573 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec9e7ec8-7392-4362-8576-d62ca0508254-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.485523 4752 generic.go:334] "Generic (PLEG): container finished" podID="acd74806-0b12-4c05-9594-acbaf92b9af9" containerID="c9c0faa7fabab5aa35b498dc4929bf133db55cdfd0aaa0be2d13f77832bd89d9" exitCode=0 Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.485690 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83dd-account-create-m48bt" event={"ID":"acd74806-0b12-4c05-9594-acbaf92b9af9","Type":"ContainerDied","Data":"c9c0faa7fabab5aa35b498dc4929bf133db55cdfd0aaa0be2d13f77832bd89d9"} Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.498409 4752 generic.go:334] "Generic (PLEG): container finished" podID="93537e9d-a8b8-4fb4-b982-2971ccf60f9d" containerID="ba0485f587e84e6729b1a0edb628c4d684a2d0274e0deb773446ecb4a744c7d3" exitCode=0 Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.498534 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b4f-account-create-pq4n9" event={"ID":"93537e9d-a8b8-4fb4-b982-2971ccf60f9d","Type":"ContainerDied","Data":"ba0485f587e84e6729b1a0edb628c4d684a2d0274e0deb773446ecb4a744c7d3"} Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.500963 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-tdgsl" event={"ID":"ec9e7ec8-7392-4362-8576-d62ca0508254","Type":"ContainerDied","Data":"e84117e98e19ede3ba855ad928c963edbc9c978ccf97222ff938c19073b9acdb"} Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.501008 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-tdgsl" Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.501026 4752 scope.go:117] "RemoveContainer" containerID="9fc9daf79bc6a767e3df083652a066248cae02378b9f0cf58166d676e1860bfa" Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.506264 4752 generic.go:334] "Generic (PLEG): container finished" podID="7df74304-1dc2-466a-b089-ecfa45af2ce1" containerID="8a24630c9c439bcce109e7be607830a0db7b63fc24cf6c38472435d5053bd622" exitCode=0 Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.506330 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mbqd5" event={"ID":"7df74304-1dc2-466a-b089-ecfa45af2ce1","Type":"ContainerDied","Data":"8a24630c9c439bcce109e7be607830a0db7b63fc24cf6c38472435d5053bd622"} Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.507976 4752 generic.go:334] "Generic (PLEG): container finished" podID="3c789332-baf6-4602-b509-7421b2ff22fe" containerID="94a1d35075cddbba48a56524dbb8d50a49c8101d32685bfc1f3920e8a633ef21" exitCode=0 Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.507999 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4kfpv" event={"ID":"3c789332-baf6-4602-b509-7421b2ff22fe","Type":"ContainerDied","Data":"94a1d35075cddbba48a56524dbb8d50a49c8101d32685bfc1f3920e8a633ef21"} Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.532521 4752 scope.go:117] "RemoveContainer" containerID="b631f35d465abf3ee59e224aeb1ffa1314cc93be6fa424eed3d23de583d0fa6f" Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.578737 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"] Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.586241 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-tdgsl"] Nov 24 11:24:02 crc kubenswrapper[4752]: I1124 11:24:02.740538 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" path="/var/lib/kubelet/pods/ec9e7ec8-7392-4362-8576-d62ca0508254/volumes" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.026393 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.037128 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4kfpv" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.054076 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.063967 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mbqd5" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212181 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mggk5\" (UniqueName: \"kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5\") pod \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212258 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts\") pod \"3c789332-baf6-4602-b509-7421b2ff22fe\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212317 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc9xh\" (UniqueName: \"kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh\") pod \"7df74304-1dc2-466a-b089-ecfa45af2ce1\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212379 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rvdc\" (UniqueName: \"kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc\") pod \"3c789332-baf6-4602-b509-7421b2ff22fe\" (UID: \"3c789332-baf6-4602-b509-7421b2ff22fe\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212418 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts\") pod \"7df74304-1dc2-466a-b089-ecfa45af2ce1\" (UID: \"7df74304-1dc2-466a-b089-ecfa45af2ce1\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts\") pod \"acd74806-0b12-4c05-9594-acbaf92b9af9\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212506 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk48f\" (UniqueName: \"kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f\") pod \"acd74806-0b12-4c05-9594-acbaf92b9af9\" (UID: \"acd74806-0b12-4c05-9594-acbaf92b9af9\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.212595 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts\") pod \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\" (UID: \"93537e9d-a8b8-4fb4-b982-2971ccf60f9d\") " Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.213863 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7df74304-1dc2-466a-b089-ecfa45af2ce1" (UID: "7df74304-1dc2-466a-b089-ecfa45af2ce1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.213911 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93537e9d-a8b8-4fb4-b982-2971ccf60f9d" (UID: "93537e9d-a8b8-4fb4-b982-2971ccf60f9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.214656 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acd74806-0b12-4c05-9594-acbaf92b9af9" (UID: "acd74806-0b12-4c05-9594-acbaf92b9af9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.215823 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c789332-baf6-4602-b509-7421b2ff22fe" (UID: "3c789332-baf6-4602-b509-7421b2ff22fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.217562 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh" (OuterVolumeSpecName: "kube-api-access-mc9xh") pod "7df74304-1dc2-466a-b089-ecfa45af2ce1" (UID: "7df74304-1dc2-466a-b089-ecfa45af2ce1"). InnerVolumeSpecName "kube-api-access-mc9xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.218064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc" (OuterVolumeSpecName: "kube-api-access-4rvdc") pod "3c789332-baf6-4602-b509-7421b2ff22fe" (UID: "3c789332-baf6-4602-b509-7421b2ff22fe"). InnerVolumeSpecName "kube-api-access-4rvdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.218857 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5" (OuterVolumeSpecName: "kube-api-access-mggk5") pod "93537e9d-a8b8-4fb4-b982-2971ccf60f9d" (UID: "93537e9d-a8b8-4fb4-b982-2971ccf60f9d"). InnerVolumeSpecName "kube-api-access-mggk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.218950 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f" (OuterVolumeSpecName: "kube-api-access-sk48f") pod "acd74806-0b12-4c05-9594-acbaf92b9af9" (UID: "acd74806-0b12-4c05-9594-acbaf92b9af9"). InnerVolumeSpecName "kube-api-access-sk48f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314057 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc9xh\" (UniqueName: \"kubernetes.io/projected/7df74304-1dc2-466a-b089-ecfa45af2ce1-kube-api-access-mc9xh\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314090 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rvdc\" (UniqueName: \"kubernetes.io/projected/3c789332-baf6-4602-b509-7421b2ff22fe-kube-api-access-4rvdc\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314100 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df74304-1dc2-466a-b089-ecfa45af2ce1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314109 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd74806-0b12-4c05-9594-acbaf92b9af9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314117 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk48f\" (UniqueName: \"kubernetes.io/projected/acd74806-0b12-4c05-9594-acbaf92b9af9-kube-api-access-sk48f\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314126 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314135 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mggk5\" (UniqueName: \"kubernetes.io/projected/93537e9d-a8b8-4fb4-b982-2971ccf60f9d-kube-api-access-mggk5\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.314143 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c789332-baf6-4602-b509-7421b2ff22fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.532682 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83dd-account-create-m48bt" event={"ID":"acd74806-0b12-4c05-9594-acbaf92b9af9","Type":"ContainerDied","Data":"be616eb4cc13396e2c2cd1b27da3380908a24a13b67a1d916e2870b8152f7d34"} Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.532721 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be616eb4cc13396e2c2cd1b27da3380908a24a13b67a1d916e2870b8152f7d34" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.532784 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-83dd-account-create-m48bt" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.547607 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b4f-account-create-pq4n9" event={"ID":"93537e9d-a8b8-4fb4-b982-2971ccf60f9d","Type":"ContainerDied","Data":"60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7"} Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.547682 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60328ce7a1e3b3b7d6d69769b67275cff7ef9dbb504ea9ee41b0b0f77867eec7" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.547737 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b4f-account-create-pq4n9" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.549484 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbqd5" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.549474 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mbqd5" event={"ID":"7df74304-1dc2-466a-b089-ecfa45af2ce1","Type":"ContainerDied","Data":"ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b"} Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.549607 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5cdfa85a8e20b1add182eec71dd648eb9bd1ef4f5e85359bad45feef699c2b" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.550842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4kfpv" event={"ID":"3c789332-baf6-4602-b509-7421b2ff22fe","Type":"ContainerDied","Data":"56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60"} Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.550864 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56325570c022ac13240414882892a413db2aaee96fe240bda39068c9e48c2d60" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.550899 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4kfpv" Nov 24 11:24:04 crc kubenswrapper[4752]: I1124 11:24:04.878374 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 24 11:24:07 crc kubenswrapper[4752]: I1124 11:24:07.593276 4752 generic.go:334] "Generic (PLEG): container finished" podID="0ecf935a-a2fa-4fb4-91f4-46f5564cb094" containerID="954b47a267aaf50acc4f84e04c8beafb487e7e1d28523852ecbb0dc2c03e0221" exitCode=0 Nov 24 11:24:07 crc kubenswrapper[4752]: I1124 11:24:07.593352 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvkzg" event={"ID":"0ecf935a-a2fa-4fb4-91f4-46f5564cb094","Type":"ContainerDied","Data":"954b47a267aaf50acc4f84e04c8beafb487e7e1d28523852ecbb0dc2c03e0221"} Nov 24 11:24:07 crc kubenswrapper[4752]: I1124 11:24:07.773243 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:24:07 crc kubenswrapper[4752]: I1124 11:24:07.793016 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"swift-storage-0\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") " pod="openstack/swift-storage-0" Nov 24 11:24:07 crc kubenswrapper[4752]: I1124 11:24:07.854528 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 24 11:24:08 crc kubenswrapper[4752]: I1124 11:24:08.899728 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" probeResult="failure" output=< Nov 24 11:24:08 crc kubenswrapper[4752]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 24 11:24:08 crc kubenswrapper[4752]: > Nov 24 11:24:11 crc kubenswrapper[4752]: I1124 11:24:11.631367 4752 generic.go:334] "Generic (PLEG): container finished" podID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerID="50d83bb320cd3643b150ab5645490c9c124d9eff8abf9994e68c250ab8f36395" exitCode=0 Nov 24 11:24:11 crc kubenswrapper[4752]: I1124 11:24:11.631454 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerDied","Data":"50d83bb320cd3643b150ab5645490c9c124d9eff8abf9994e68c250ab8f36395"} Nov 24 11:24:11 crc kubenswrapper[4752]: I1124 11:24:11.634782 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerID="ca63fd0de5f52cf43fe00bb9dfbd1436710419f23b5ae8d734e790c2aa804e2b" exitCode=0 Nov 24 11:24:11 crc kubenswrapper[4752]: I1124 11:24:11.634808 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerDied","Data":"ca63fd0de5f52cf43fe00bb9dfbd1436710419f23b5ae8d734e790c2aa804e2b"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.058437 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.235413 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.259990 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260089 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260112 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260155 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260205 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgxbh\" (UniqueName: \"kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260263 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf\") pod \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\" (UID: \"0ecf935a-a2fa-4fb4-91f4-46f5564cb094\") " Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.260951 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.261247 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.264574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh" (OuterVolumeSpecName: "kube-api-access-cgxbh") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "kube-api-access-cgxbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.267610 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.280776 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts" (OuterVolumeSpecName: "scripts") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.282935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.290053 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0ecf935a-a2fa-4fb4-91f4-46f5564cb094" (UID: "0ecf935a-a2fa-4fb4-91f4-46f5564cb094"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362695 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgxbh\" (UniqueName: \"kubernetes.io/projected/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-kube-api-access-cgxbh\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362732 4752 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362756 4752 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362765 4752 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362775 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362783 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.362791 4752 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0ecf935a-a2fa-4fb4-91f4-46f5564cb094-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.642516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx5fv" event={"ID":"7812a05d-b13c-413a-ae63-9317fec939e5","Type":"ContainerStarted","Data":"cd577f878c121c20c426390b80b9699b5ef8403e3a67c8fb2140b6b0c76f6a3f"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.644737 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerStarted","Data":"d59dfb8e5264cefe368e6631c2fcd950f686b3a37ff453ac4489ed24628cf2fd"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.645020 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.646666 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerStarted","Data":"d0225139306006f6265b3807404dbc72413aeefa6c2ec7ca1125827941ca3fbd"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.646865 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.648034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvkzg" event={"ID":"0ecf935a-a2fa-4fb4-91f4-46f5564cb094","Type":"ContainerDied","Data":"e3275f2e46442a0e5935e4e36b083c5cd9a29cc63f2996ca7f6227a4f12165ac"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.648063 4752 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3275f2e46442a0e5935e4e36b083c5cd9a29cc63f2996ca7f6227a4f12165ac" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.648039 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vvkzg" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.649058 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"7ed21dba93fa06628760ea5c9289230f478fea6824ab9e73cdc301dbb0a189b7"} Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.666018 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xx5fv" podStartSLOduration=2.572468605 podStartE2EDuration="13.665996914s" podCreationTimestamp="2025-11-24 11:23:59 +0000 UTC" firstStartedPulling="2025-11-24 11:24:00.864660723 +0000 UTC m=+1046.849481012" lastFinishedPulling="2025-11-24 11:24:11.958189032 +0000 UTC m=+1057.943009321" observedRunningTime="2025-11-24 11:24:12.658995502 +0000 UTC m=+1058.643815791" watchObservedRunningTime="2025-11-24 11:24:12.665996914 +0000 UTC m=+1058.650817213" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.690240 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.38689947 podStartE2EDuration="58.690224044s" podCreationTimestamp="2025-11-24 11:23:14 +0000 UTC" firstStartedPulling="2025-11-24 11:23:27.48828197 +0000 UTC m=+1013.473102259" lastFinishedPulling="2025-11-24 11:23:36.791606534 +0000 UTC m=+1022.776426833" observedRunningTime="2025-11-24 11:24:12.685431416 +0000 UTC m=+1058.670251705" watchObservedRunningTime="2025-11-24 11:24:12.690224044 +0000 UTC m=+1058.675044333" Nov 24 11:24:12 crc kubenswrapper[4752]: I1124 11:24:12.706175 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.489622517 podStartE2EDuration="58.706163335s" podCreationTimestamp="2025-11-24 11:23:14 +0000 UTC" firstStartedPulling="2025-11-24 11:23:28.714272332 +0000 UTC m=+1014.699092611" lastFinishedPulling="2025-11-24 11:23:36.93081314 +0000 UTC m=+1022.915633429" observedRunningTime="2025-11-24 11:24:12.703513028 +0000 UTC m=+1058.688333327" watchObservedRunningTime="2025-11-24 11:24:12.706163335 +0000 UTC m=+1058.690983624" Nov 24 11:24:13 crc kubenswrapper[4752]: I1124 11:24:13.996728 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.028905 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-v6jhw" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.058447 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" probeResult="failure" output=< Nov 24 11:24:14 crc kubenswrapper[4752]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 24 11:24:14 crc kubenswrapper[4752]: > Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315017 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xmc2s-config-jzrjx"] Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315617 4752 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="dnsmasq-dns" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315663 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="dnsmasq-dns" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315685 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="init" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315696 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="init" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315709 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecf935a-a2fa-4fb4-91f4-46f5564cb094" containerName="swift-ring-rebalance" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315737 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecf935a-a2fa-4fb4-91f4-46f5564cb094" containerName="swift-ring-rebalance" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315763 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df74304-1dc2-466a-b089-ecfa45af2ce1" containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315772 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df74304-1dc2-466a-b089-ecfa45af2ce1" containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315816 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c789332-baf6-4602-b509-7421b2ff22fe" containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315828 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c789332-baf6-4602-b509-7421b2ff22fe" containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315846 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93537e9d-a8b8-4fb4-b982-2971ccf60f9d" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315854 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="93537e9d-a8b8-4fb4-b982-2971ccf60f9d" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: E1124 11:24:14.315864 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd74806-0b12-4c05-9594-acbaf92b9af9" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.315871 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd74806-0b12-4c05-9594-acbaf92b9af9" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316164 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="93537e9d-a8b8-4fb4-b982-2971ccf60f9d" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316240 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df74304-1dc2-466a-b089-ecfa45af2ce1" containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316265 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecf935a-a2fa-4fb4-91f4-46f5564cb094" containerName="swift-ring-rebalance" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316312 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c789332-baf6-4602-b509-7421b2ff22fe" 
containerName="mariadb-database-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316343 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9e7ec8-7392-4362-8576-d62ca0508254" containerName="dnsmasq-dns" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.316389 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd74806-0b12-4c05-9594-acbaf92b9af9" containerName="mariadb-account-create" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.317450 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.320443 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.329109 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xmc2s-config-jzrjx"] Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.496766 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.496809 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.496843 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.496864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.496980 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.497189 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6dm\" (UniqueName: \"kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598595 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598649 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598681 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598725 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.598798 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6dm\" (UniqueName: \"kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.599225 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.599250 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.599361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.599734 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.601482 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.623646 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6dm\" (UniqueName: \"kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm\") pod \"ovn-controller-xmc2s-config-jzrjx\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.637679 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.667976 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124"} Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.668494 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4"} Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.668609 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75"} Nov 24 11:24:14 crc kubenswrapper[4752]: I1124 11:24:14.668692 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177"} Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.085179 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xmc2s-config-jzrjx"] Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.468429 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.468493 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.677600 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="e21413b7-2f97-4fed-912d-d2d71555b32d" containerID="f4135c4e3c9981db1bcebbf4dd1fe967223301c429be2fa843360da3aaec3e0b" exitCode=0 Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.677707 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s-config-jzrjx" event={"ID":"e21413b7-2f97-4fed-912d-d2d71555b32d","Type":"ContainerDied","Data":"f4135c4e3c9981db1bcebbf4dd1fe967223301c429be2fa843360da3aaec3e0b"} Nov 24 11:24:15 crc kubenswrapper[4752]: I1124 11:24:15.678189 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s-config-jzrjx" event={"ID":"e21413b7-2f97-4fed-912d-d2d71555b32d","Type":"ContainerStarted","Data":"bfb7b1a7b519d64f6d895ba1f84e48a87d302d160e8448b28103a9c8b7c74772"} Nov 24 11:24:16 crc kubenswrapper[4752]: I1124 11:24:16.691528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"} Nov 24 11:24:16 crc kubenswrapper[4752]: I1124 11:24:16.691881 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae"} Nov 24 11:24:16 crc kubenswrapper[4752]: I1124 11:24:16.691895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e"} Nov 24 11:24:16 crc kubenswrapper[4752]: I1124 11:24:16.691905 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317"} Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.063705 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236317 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236369 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6dm\" (UniqueName: \"kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236399 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236525 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236544 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.236599 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn\") pod \"e21413b7-2f97-4fed-912d-d2d71555b32d\" (UID: \"e21413b7-2f97-4fed-912d-d2d71555b32d\") " Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.237035 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run" (OuterVolumeSpecName: "var-run") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.237104 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.237266 4752 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.237589 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.238093 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.239190 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts" (OuterVolumeSpecName: "scripts") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.242930 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm" (OuterVolumeSpecName: "kube-api-access-bb6dm") pod "e21413b7-2f97-4fed-912d-d2d71555b32d" (UID: "e21413b7-2f97-4fed-912d-d2d71555b32d"). InnerVolumeSpecName "kube-api-access-bb6dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.338570 4752 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.338600 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6dm\" (UniqueName: \"kubernetes.io/projected/e21413b7-2f97-4fed-912d-d2d71555b32d-kube-api-access-bb6dm\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.338611 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e21413b7-2f97-4fed-912d-d2d71555b32d-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.338620 4752 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.338627 4752 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e21413b7-2f97-4fed-912d-d2d71555b32d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.701038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s-config-jzrjx" event={"ID":"e21413b7-2f97-4fed-912d-d2d71555b32d","Type":"ContainerDied","Data":"bfb7b1a7b519d64f6d895ba1f84e48a87d302d160e8448b28103a9c8b7c74772"} Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.701403 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb7b1a7b519d64f6d895ba1f84e48a87d302d160e8448b28103a9c8b7c74772" Nov 24 11:24:17 crc kubenswrapper[4752]: I1124 11:24:17.701077 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xmc2s-config-jzrjx" Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.170785 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xmc2s-config-jzrjx"] Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.174679 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xmc2s-config-jzrjx"] Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.712923 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"} Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.713268 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"} Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.713290 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"} Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.741981 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21413b7-2f97-4fed-912d-d2d71555b32d" path="/var/lib/kubelet/pods/e21413b7-2f97-4fed-912d-d2d71555b32d/volumes" Nov 24 11:24:18 crc kubenswrapper[4752]: I1124 11:24:18.924393 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xmc2s" Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.727516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"} Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.727888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"} Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.727899 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"} Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.727907 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerStarted","Data":"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"} Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.730304 4752 generic.go:334] "Generic (PLEG): container finished" podID="7812a05d-b13c-413a-ae63-9317fec939e5" containerID="cd577f878c121c20c426390b80b9699b5ef8403e3a67c8fb2140b6b0c76f6a3f" exitCode=0 Nov 24 11:24:19 crc kubenswrapper[4752]: I1124 11:24:19.730356 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx5fv" event={"ID":"7812a05d-b13c-413a-ae63-9317fec939e5","Type":"ContainerDied","Data":"cd577f878c121c20c426390b80b9699b5ef8403e3a67c8fb2140b6b0c76f6a3f"} Nov 24 11:24:19 crc kubenswrapper[4752]: 
I1124 11:24:19.772279 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.946175794 podStartE2EDuration="29.772253584s" podCreationTimestamp="2025-11-24 11:23:50 +0000 UTC" firstStartedPulling="2025-11-24 11:24:12.250347844 +0000 UTC m=+1058.235168143" lastFinishedPulling="2025-11-24 11:24:18.076425644 +0000 UTC m=+1064.061245933" observedRunningTime="2025-11-24 11:24:19.758669742 +0000 UTC m=+1065.743490041" watchObservedRunningTime="2025-11-24 11:24:19.772253584 +0000 UTC m=+1065.757073873" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.046789 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:20 crc kubenswrapper[4752]: E1124 11:24:20.047173 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21413b7-2f97-4fed-912d-d2d71555b32d" containerName="ovn-config" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.047191 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21413b7-2f97-4fed-912d-d2d71555b32d" containerName="ovn-config" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.047370 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21413b7-2f97-4fed-912d-d2d71555b32d" containerName="ovn-config" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.048197 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.050145 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.062539 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200518 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200604 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db267\" (UniqueName: \"kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200666 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200701 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.200725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302454 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db267\" (UniqueName: \"kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302483 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302522 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302547 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.302587 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.303363 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.303375 4752 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.303936 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.303982 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.304708 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.327717 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db267\" (UniqueName: \"kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267\") pod \"dnsmasq-dns-5c79d794d7-nxgwb\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.366661 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:20 crc kubenswrapper[4752]: I1124 11:24:20.832834 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.096543 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xx5fv" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.217349 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle\") pod \"7812a05d-b13c-413a-ae63-9317fec939e5\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.217464 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdls8\" (UniqueName: \"kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8\") pod \"7812a05d-b13c-413a-ae63-9317fec939e5\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.217498 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data\") pod \"7812a05d-b13c-413a-ae63-9317fec939e5\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.217557 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data\") pod \"7812a05d-b13c-413a-ae63-9317fec939e5\" (UID: \"7812a05d-b13c-413a-ae63-9317fec939e5\") " Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.221900 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7812a05d-b13c-413a-ae63-9317fec939e5" (UID: "7812a05d-b13c-413a-ae63-9317fec939e5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.223475 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8" (OuterVolumeSpecName: "kube-api-access-pdls8") pod "7812a05d-b13c-413a-ae63-9317fec939e5" (UID: "7812a05d-b13c-413a-ae63-9317fec939e5"). InnerVolumeSpecName "kube-api-access-pdls8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.241233 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7812a05d-b13c-413a-ae63-9317fec939e5" (UID: "7812a05d-b13c-413a-ae63-9317fec939e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.260119 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data" (OuterVolumeSpecName: "config-data") pod "7812a05d-b13c-413a-ae63-9317fec939e5" (UID: "7812a05d-b13c-413a-ae63-9317fec939e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.319621 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.319861 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdls8\" (UniqueName: \"kubernetes.io/projected/7812a05d-b13c-413a-ae63-9317fec939e5-kube-api-access-pdls8\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.319947 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.320004 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7812a05d-b13c-413a-ae63-9317fec939e5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.752283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xx5fv" event={"ID":"7812a05d-b13c-413a-ae63-9317fec939e5","Type":"ContainerDied","Data":"f5eeba3351c8b1dde0059d1f07e3dbed9b001fc82be9110c3c2ca259f1b429a0"} Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.752620 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5eeba3351c8b1dde0059d1f07e3dbed9b001fc82be9110c3c2ca259f1b429a0" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.752329 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xx5fv" Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.756201 4752 generic.go:334] "Generic (PLEG): container finished" podID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerID="ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5" exitCode=0 Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.756383 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" event={"ID":"e222c213-2c36-4011-bdac-ce0d2b58e244","Type":"ContainerDied","Data":"ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5"} Nov 24 11:24:21 crc kubenswrapper[4752]: I1124 11:24:21.756481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" event={"ID":"e222c213-2c36-4011-bdac-ce0d2b58e244","Type":"ContainerStarted","Data":"6001be2b9207b4cbf9206d8c9a99b38313b2bc40b63cc008c18a1f2fe7d6d03f"} Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.153912 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.194881 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:24:22 crc kubenswrapper[4752]: E1124 11:24:22.195432 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7812a05d-b13c-413a-ae63-9317fec939e5" containerName="glance-db-sync" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.195507 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7812a05d-b13c-413a-ae63-9317fec939e5" containerName="glance-db-sync" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.195707 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7812a05d-b13c-413a-ae63-9317fec939e5" containerName="glance-db-sync" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.197277 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.212115 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337464 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337521 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337680 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337733 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337826 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.337842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phk5\" (UniqueName: \"kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439216 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439273 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439341 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439389 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439439 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.439460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phk5\" (UniqueName: \"kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.440388 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.440450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.440532 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.440605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.441229 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.456202 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phk5\" (UniqueName: 
\"kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5\") pod \"dnsmasq-dns-5f59b8f679-wlhcd\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.529189 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.774024 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" event={"ID":"e222c213-2c36-4011-bdac-ce0d2b58e244","Type":"ContainerStarted","Data":"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805"} Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.774485 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerName="dnsmasq-dns" containerID="cri-o://afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805" gracePeriod=10 Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.774578 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:22 crc kubenswrapper[4752]: I1124 11:24:22.800473 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" podStartSLOduration=2.800449892 podStartE2EDuration="2.800449892s" podCreationTimestamp="2025-11-24 11:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:22.792030969 +0000 UTC m=+1068.776851258" watchObservedRunningTime="2025-11-24 11:24:22.800449892 +0000 UTC m=+1068.785270181" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.014111 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:24:23 crc kubenswrapper[4752]: W1124 11:24:23.026008 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f9fc951_696e_4d2e_aa1e_1180df42a675.slice/crio-2aec3a77a1a43c9eece1f743737ce874f36615f7d74223cfd9676df7feb9f314 WatchSource:0}: Error finding container 2aec3a77a1a43c9eece1f743737ce874f36615f7d74223cfd9676df7feb9f314: Status 404 returned error can't find the container with id 2aec3a77a1a43c9eece1f743737ce874f36615f7d74223cfd9676df7feb9f314 Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.129080 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.250376 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.250434 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db267\" (UniqueName: \"kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.250927 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.251154 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.251227 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.251265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0\") pod \"e222c213-2c36-4011-bdac-ce0d2b58e244\" (UID: \"e222c213-2c36-4011-bdac-ce0d2b58e244\") " Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.257084 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267" (OuterVolumeSpecName: "kube-api-access-db267") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "kube-api-access-db267". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.307733 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.307875 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.309117 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.319850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config" (OuterVolumeSpecName: "config") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.321095 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e222c213-2c36-4011-bdac-ce0d2b58e244" (UID: "e222c213-2c36-4011-bdac-ce0d2b58e244"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356210 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356254 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db267\" (UniqueName: \"kubernetes.io/projected/e222c213-2c36-4011-bdac-ce0d2b58e244-kube-api-access-db267\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356271 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356283 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356298 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.356310 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e222c213-2c36-4011-bdac-ce0d2b58e244-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.781708 4752 generic.go:334] "Generic (PLEG): container finished" podID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerID="afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805" exitCode=0 Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.781785 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" event={"ID":"e222c213-2c36-4011-bdac-ce0d2b58e244","Type":"ContainerDied","Data":"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805"} Nov 24 
11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.781811 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" event={"ID":"e222c213-2c36-4011-bdac-ce0d2b58e244","Type":"ContainerDied","Data":"6001be2b9207b4cbf9206d8c9a99b38313b2bc40b63cc008c18a1f2fe7d6d03f"} Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.781826 4752 scope.go:117] "RemoveContainer" containerID="afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.781942 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-nxgwb" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.792563 4752 generic.go:334] "Generic (PLEG): container finished" podID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerID="4110fb4eaf2a33bcb9b0aa5b35b8932cde2412fa52a1416f911ab446c9d3029e" exitCode=0 Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.792681 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" event={"ID":"2f9fc951-696e-4d2e-aa1e-1180df42a675","Type":"ContainerDied","Data":"4110fb4eaf2a33bcb9b0aa5b35b8932cde2412fa52a1416f911ab446c9d3029e"} Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.792776 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" event={"ID":"2f9fc951-696e-4d2e-aa1e-1180df42a675","Type":"ContainerStarted","Data":"2aec3a77a1a43c9eece1f743737ce874f36615f7d74223cfd9676df7feb9f314"} Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.807623 4752 scope.go:117] "RemoveContainer" containerID="ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.850640 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.858871 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-nxgwb"] Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.930556 4752 scope.go:117] "RemoveContainer" containerID="afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805" Nov 24 11:24:23 crc kubenswrapper[4752]: E1124 11:24:23.931159 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805\": container with ID starting with afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805 not found: ID does not exist" containerID="afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.931211 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805"} err="failed to get container status \"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805\": rpc error: code = NotFound desc = could not find container \"afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805\": container with ID starting with afb6403354fe472f7aab259fadbca6c14685ba7052492d7162e2a87d9d250805 not found: ID does not exist" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.931251 4752 scope.go:117] "RemoveContainer" containerID="ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5" Nov 24 11:24:23 crc kubenswrapper[4752]: E1124 11:24:23.931959 4752 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5\": container with ID starting with ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5 not found: ID does not exist" containerID="ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5" Nov 24 11:24:23 crc kubenswrapper[4752]: I1124 11:24:23.932012 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5"} err="failed to get container status \"ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5\": rpc error: code = NotFound desc = could not find container \"ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5\": container with ID starting with ccb5b583ceac005d3d543d314154b084ab3360d01b463a69e12ccc0a95f2c3d5 not found: ID does not exist" Nov 24 11:24:24 crc kubenswrapper[4752]: I1124 11:24:24.743724 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" path="/var/lib/kubelet/pods/e222c213-2c36-4011-bdac-ce0d2b58e244/volumes" Nov 24 11:24:24 crc kubenswrapper[4752]: I1124 11:24:24.803252 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" event={"ID":"2f9fc951-696e-4d2e-aa1e-1180df42a675","Type":"ContainerStarted","Data":"aab9554c3d38909724fd4664e4734e67c9131a32b84339e4dc3f11e690fd71f1"} Nov 24 11:24:24 crc kubenswrapper[4752]: I1124 11:24:24.803489 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:24:24 crc kubenswrapper[4752]: I1124 11:24:24.828268 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podStartSLOduration=2.828249344 podStartE2EDuration="2.828249344s" podCreationTimestamp="2025-11-24 11:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:24.825263618 +0000 UTC m=+1070.810083927" watchObservedRunningTime="2025-11-24 11:24:24.828249344 +0000 UTC m=+1070.813069633" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.377989 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.661449 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.762985 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5krk8"] Nov 24 11:24:25 crc kubenswrapper[4752]: E1124 11:24:25.763339 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerName="dnsmasq-dns" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.763353 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerName="dnsmasq-dns" Nov 24 11:24:25 crc kubenswrapper[4752]: E1124 11:24:25.763388 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerName="init" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.763395 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" 
containerName="init" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.763575 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e222c213-2c36-4011-bdac-ce0d2b58e244" containerName="dnsmasq-dns" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.764194 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.773722 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5krk8"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.827501 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dp6s7"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.828513 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.841770 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dp6s7"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.848393 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3b7b-account-create-6hhgc"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.849377 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.851244 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.867119 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3b7b-account-create-6hhgc"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894483 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74msc\" (UniqueName: \"kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894610 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8smcr\" (UniqueName: \"kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894665 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894735 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vl58\" (UniqueName: \"kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.894957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.948898 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f581-account-create-hwjsp"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.950045 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.958020 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.973511 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f581-account-create-hwjsp"] Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.996274 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8smcr\" (UniqueName: \"kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.996549 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.996651 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrzj\" (UniqueName: \"kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.996736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vl58\" (UniqueName: \"kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.996849 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.997002 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.997101 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74msc\" (UniqueName: \"kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.997170 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.998086 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.998795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:25 crc kubenswrapper[4752]: I1124 11:24:25.999430 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.033545 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8smcr\" (UniqueName: \"kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr\") pod \"barbican-db-create-dp6s7\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") " pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.043393 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74msc\" (UniqueName: \"kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc\") pod \"cinder-db-create-5krk8\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") " pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.049270 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vl58\" (UniqueName: \"kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58\") pod \"cinder-3b7b-account-create-6hhgc\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") " pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.054921 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6rbdm"] Nov 24 11:24:26 crc kubenswrapper[4752]: 
I1124 11:24:26.055942 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.061105 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6rbdm"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.081459 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5krk8" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.097566 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wf9mt"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.098575 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.100108 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.100155 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr4n4\" (UniqueName: \"kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.100179 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.100213 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrzj\" (UniqueName: \"kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.100769 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.103833 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.103858 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.104032 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.104193 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kdtjx" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 
11:24:26.121728 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wf9mt"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.122872 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrzj\" (UniqueName: \"kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj\") pod \"barbican-f581-account-create-hwjsp\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") " pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.153486 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dp6s7" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.162701 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-574b-account-create-qkvsq"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.165440 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.167276 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3b7b-account-create-6hhgc" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.168573 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.177837 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-574b-account-create-qkvsq"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.202088 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr4n4\" (UniqueName: \"kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.202129 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.202153 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.202173 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.202203 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2hs\" (UniqueName: \"kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 
11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.203111 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.239234 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr4n4\" (UniqueName: \"kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4\") pod \"neutron-db-create-6rbdm\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") " pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.269881 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f581-account-create-hwjsp" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.304267 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.304312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.304344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2hs\" (UniqueName: \"kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.304423 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7w74\" (UniqueName: \"kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.304450 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.308920 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.314154 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data\") 
pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.330083 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2hs\" (UniqueName: \"kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs\") pod \"keystone-db-sync-wf9mt\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") " pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.405719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7w74\" (UniqueName: \"kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.405783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.406544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.423792 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7w74\" (UniqueName: \"kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74\") pod \"neutron-574b-account-create-qkvsq\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") " pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.499020 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rbdm" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.547833 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wf9mt" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.557217 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-574b-account-create-qkvsq" Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.591605 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5krk8"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.726301 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3b7b-account-create-6hhgc"] Nov 24 11:24:26 crc kubenswrapper[4752]: I1124 11:24:26.737313 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dp6s7"] Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.170058 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d WatchSource:0}: Error finding container a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d: Status 404 returned error can't find the container with id a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.175338 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa205dc7_c2a0_418c_a696_ccdfa4c0d43f.slice/crio-1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097 WatchSource:0}: Error finding container 1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097: Status 404 returned error can't find the container with id 1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097 Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.634485 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6rbdm"] Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.638919 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c WatchSource:0}: Error finding container e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c: Status 404 returned error can't find the container with id e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.677415 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f581-account-create-hwjsp"] Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.680263 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2 WatchSource:0}: Error finding container 525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2: Status 404 returned error can't find the container with id 525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2 Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.809283 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c1cc988_7c84_4647_98d5_6831aaff2915.slice/crio-003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e WatchSource:0}: Error finding container 003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e: Status 404 returned error can't find the container with id 
003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.819670 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-574b-account-create-qkvsq"]
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.846931 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-574b-account-create-qkvsq" event={"ID":"0c1cc988-7c84-4647-98d5-6831aaff2915","Type":"ContainerStarted","Data":"003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.851980 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5krk8" event={"ID":"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f","Type":"ContainerStarted","Data":"1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.854462 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b7b-account-create-6hhgc" event={"ID":"baa3eaaa-7445-48bc-80fc-753d38628a73","Type":"ContainerStarted","Data":"ad73cc4f9efd638bf52e579a67b2c7a0f7854015411152e4bd1fa71790c929aa"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.861873 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rbdm" event={"ID":"7e150ca9-5a87-40ba-bbad-bfae139179ae","Type":"ContainerStarted","Data":"e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.864324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dp6s7" event={"ID":"aa308cc4-8cc5-4a63-926a-033a151f7291","Type":"ContainerStarted","Data":"a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.868108 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f581-account-create-hwjsp" event={"ID":"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d","Type":"ContainerStarted","Data":"525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2"}
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.880298 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-dp6s7" podStartSLOduration=3.880281625 podStartE2EDuration="3.880281625s" podCreationTimestamp="2025-11-24 11:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:28.878254476 +0000 UTC m=+1074.863074765" watchObservedRunningTime="2025-11-24 11:24:28.880281625 +0000 UTC m=+1074.865101914"
Nov 24 11:24:28 crc kubenswrapper[4752]: I1124 11:24:28.913935 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wf9mt"]
Nov 24 11:24:28 crc kubenswrapper[4752]: W1124 11:24:28.930372 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e82288_6d0b_4bc0_b934_5c0c869a8198.slice/crio-e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79 WatchSource:0}: Error finding container e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79: Status 404 returned error can't find the container with id e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.879934 4752 generic.go:334] "Generic (PLEG): container finished" podID="7e150ca9-5a87-40ba-bbad-bfae139179ae" containerID="635c128b61e0f1be298eda06897c755b3bed52e73a8086297eaf4ffc3f88e948" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.879991 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rbdm" event={"ID":"7e150ca9-5a87-40ba-bbad-bfae139179ae","Type":"ContainerDied","Data":"635c128b61e0f1be298eda06897c755b3bed52e73a8086297eaf4ffc3f88e948"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.882649 4752 generic.go:334] "Generic (PLEG): container finished" podID="aa308cc4-8cc5-4a63-926a-033a151f7291" containerID="edf2a6cd801741600904ad2ea8216f876114e28709f11d16ba51608f5807b542" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.882712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dp6s7" event={"ID":"aa308cc4-8cc5-4a63-926a-033a151f7291","Type":"ContainerDied","Data":"edf2a6cd801741600904ad2ea8216f876114e28709f11d16ba51608f5807b542"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.885736 4752 generic.go:334] "Generic (PLEG): container finished" podID="04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" containerID="e14543262653c738d489c1dbda8ff1a881ebfc697586d0e9304a135ff3b5f748" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.885915 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f581-account-create-hwjsp" event={"ID":"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d","Type":"ContainerDied","Data":"e14543262653c738d489c1dbda8ff1a881ebfc697586d0e9304a135ff3b5f748"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.889932 4752 generic.go:334] "Generic (PLEG): container finished" podID="0c1cc988-7c84-4647-98d5-6831aaff2915" containerID="89f677f71ca7f0defc38da546867b6b1ffd8d6a90e5b7096f3f34fdcbd93a5d2" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.890001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-574b-account-create-qkvsq" event={"ID":"0c1cc988-7c84-4647-98d5-6831aaff2915","Type":"ContainerDied","Data":"89f677f71ca7f0defc38da546867b6b1ffd8d6a90e5b7096f3f34fdcbd93a5d2"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.892053 4752 generic.go:334] "Generic (PLEG): container finished" podID="fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" containerID="b77b1b1e377ddca3c012d7c1f6ddebf83b2c8548e70877aa230822808f642838" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.892074 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5krk8" event={"ID":"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f","Type":"ContainerDied","Data":"b77b1b1e377ddca3c012d7c1f6ddebf83b2c8548e70877aa230822808f642838"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.895674 4752 generic.go:334] "Generic (PLEG): container finished" podID="baa3eaaa-7445-48bc-80fc-753d38628a73" containerID="5c6f110ff0032fe3eb25f36701cc1f335c97745e7f001c2c26b1d02c30d0b5ce" exitCode=0
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.895830 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b7b-account-create-6hhgc" event={"ID":"baa3eaaa-7445-48bc-80fc-753d38628a73","Type":"ContainerDied","Data":"5c6f110ff0032fe3eb25f36701cc1f335c97745e7f001c2c26b1d02c30d0b5ce"}
Nov 24 11:24:29 crc kubenswrapper[4752]: I1124 11:24:29.904507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wf9mt" event={"ID":"e4e82288-6d0b-4bc0-b934-5c0c869a8198","Type":"ContainerStarted","Data":"e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79"}
Nov 24 11:24:32 crc kubenswrapper[4752]: I1124 11:24:32.530939 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd"
Nov 24 11:24:32 crc kubenswrapper[4752]: I1124 11:24:32.587436 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"]
Nov 24 11:24:32 crc kubenswrapper[4752]: I1124 11:24:32.587661 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="dnsmasq-dns" containerID="cri-o://7fee1cd1a6c9edd3a15d864d2449f04014cf565e182b53a6fe8f3432c046e3ab" gracePeriod=10
Nov 24 11:24:32 crc kubenswrapper[4752]: I1124 11:24:32.930843 4752 generic.go:334] "Generic (PLEG): container finished" podID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerID="7fee1cd1a6c9edd3a15d864d2449f04014cf565e182b53a6fe8f3432c046e3ab" exitCode=0
Nov 24 11:24:32 crc kubenswrapper[4752]: I1124 11:24:32.931012 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" event={"ID":"d6a8e0cd-1585-416d-9142-fbe4991e8cda","Type":"ContainerDied","Data":"7fee1cd1a6c9edd3a15d864d2449f04014cf565e182b53a6fe8f3432c046e3ab"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.526982 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-574b-account-create-qkvsq"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.533874 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dp6s7"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.565861 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f581-account-create-hwjsp"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.574380 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5krk8"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.587631 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rbdm"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.605343 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3b7b-account-create-6hhgc"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.611556 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.692151 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts\") pod \"0c1cc988-7c84-4647-98d5-6831aaff2915\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.692773 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74msc\" (UniqueName: \"kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc\") pod \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.692925 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbrzj\" (UniqueName: \"kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj\") pod \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.692917 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c1cc988-7c84-4647-98d5-6831aaff2915" (UID: "0c1cc988-7c84-4647-98d5-6831aaff2915"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693235 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb\") pod \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vl58\" (UniqueName: \"kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58\") pod \"baa3eaaa-7445-48bc-80fc-753d38628a73\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693507 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts\") pod \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\" (UID: \"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc\") pod \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693777 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts\") pod \"aa308cc4-8cc5-4a63-926a-033a151f7291\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.693891 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwg4p\" (UniqueName: \"kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p\") pod \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694011 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr4n4\" (UniqueName: \"kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4\") pod \"7e150ca9-5a87-40ba-bbad-bfae139179ae\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa308cc4-8cc5-4a63-926a-033a151f7291" (UID: "aa308cc4-8cc5-4a63-926a-033a151f7291"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7w74\" (UniqueName: \"kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74\") pod \"0c1cc988-7c84-4647-98d5-6831aaff2915\" (UID: \"0c1cc988-7c84-4647-98d5-6831aaff2915\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694584 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8smcr\" (UniqueName: \"kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr\") pod \"aa308cc4-8cc5-4a63-926a-033a151f7291\" (UID: \"aa308cc4-8cc5-4a63-926a-033a151f7291\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694697 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts\") pod \"7e150ca9-5a87-40ba-bbad-bfae139179ae\" (UID: \"7e150ca9-5a87-40ba-bbad-bfae139179ae\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694859 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts\") pod \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\" (UID: \"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.695527 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c1cc988-7c84-4647-98d5-6831aaff2915-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.695905 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa308cc4-8cc5-4a63-926a-033a151f7291-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.694762 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" (UID: "fa205dc7-c2a0-418c-a696-ccdfa4c0d43f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.695687 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e150ca9-5a87-40ba-bbad-bfae139179ae" (UID: "7e150ca9-5a87-40ba-bbad-bfae139179ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.696076 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" (UID: "04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.699423 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p" (OuterVolumeSpecName: "kube-api-access-vwg4p") pod "d6a8e0cd-1585-416d-9142-fbe4991e8cda" (UID: "d6a8e0cd-1585-416d-9142-fbe4991e8cda"). InnerVolumeSpecName "kube-api-access-vwg4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.699466 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj" (OuterVolumeSpecName: "kube-api-access-cbrzj") pod "04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" (UID: "04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d"). InnerVolumeSpecName "kube-api-access-cbrzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.699594 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58" (OuterVolumeSpecName: "kube-api-access-6vl58") pod "baa3eaaa-7445-48bc-80fc-753d38628a73" (UID: "baa3eaaa-7445-48bc-80fc-753d38628a73"). InnerVolumeSpecName "kube-api-access-6vl58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.699757 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr" (OuterVolumeSpecName: "kube-api-access-8smcr") pod "aa308cc4-8cc5-4a63-926a-033a151f7291" (UID: "aa308cc4-8cc5-4a63-926a-033a151f7291"). InnerVolumeSpecName "kube-api-access-8smcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.700068 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74" (OuterVolumeSpecName: "kube-api-access-c7w74") pod "0c1cc988-7c84-4647-98d5-6831aaff2915" (UID: "0c1cc988-7c84-4647-98d5-6831aaff2915"). InnerVolumeSpecName "kube-api-access-c7w74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.700977 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4" (OuterVolumeSpecName: "kube-api-access-fr4n4") pod "7e150ca9-5a87-40ba-bbad-bfae139179ae" (UID: "7e150ca9-5a87-40ba-bbad-bfae139179ae"). InnerVolumeSpecName "kube-api-access-fr4n4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.701502 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc" (OuterVolumeSpecName: "kube-api-access-74msc") pod "fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" (UID: "fa205dc7-c2a0-418c-a696-ccdfa4c0d43f"). InnerVolumeSpecName "kube-api-access-74msc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.741037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6a8e0cd-1585-416d-9142-fbe4991e8cda" (UID: "d6a8e0cd-1585-416d-9142-fbe4991e8cda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.745175 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6a8e0cd-1585-416d-9142-fbe4991e8cda" (UID: "d6a8e0cd-1585-416d-9142-fbe4991e8cda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.796894 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb\") pod \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.797447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config\") pod \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\" (UID: \"d6a8e0cd-1585-416d-9142-fbe4991e8cda\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.797558 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts\") pod \"baa3eaaa-7445-48bc-80fc-753d38628a73\" (UID: \"baa3eaaa-7445-48bc-80fc-753d38628a73\") "
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.798170 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "baa3eaaa-7445-48bc-80fc-753d38628a73" (UID: "baa3eaaa-7445-48bc-80fc-753d38628a73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.798843 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr4n4\" (UniqueName: \"kubernetes.io/projected/7e150ca9-5a87-40ba-bbad-bfae139179ae-kube-api-access-fr4n4\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.798949 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7w74\" (UniqueName: \"kubernetes.io/projected/0c1cc988-7c84-4647-98d5-6831aaff2915-kube-api-access-c7w74\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799068 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8smcr\" (UniqueName: \"kubernetes.io/projected/aa308cc4-8cc5-4a63-926a-033a151f7291-kube-api-access-8smcr\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799105 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e150ca9-5a87-40ba-bbad-bfae139179ae-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799127 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799149 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa3eaaa-7445-48bc-80fc-753d38628a73-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799168 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74msc\" (UniqueName: \"kubernetes.io/projected/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-kube-api-access-74msc\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799185 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbrzj\" (UniqueName: \"kubernetes.io/projected/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d-kube-api-access-cbrzj\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799204 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799223 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vl58\" (UniqueName: \"kubernetes.io/projected/baa3eaaa-7445-48bc-80fc-753d38628a73-kube-api-access-6vl58\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799282 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799317 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.799356 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwg4p\" (UniqueName: \"kubernetes.io/projected/d6a8e0cd-1585-416d-9142-fbe4991e8cda-kube-api-access-vwg4p\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.837291 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6a8e0cd-1585-416d-9142-fbe4991e8cda" (UID: "d6a8e0cd-1585-416d-9142-fbe4991e8cda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.840349 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config" (OuterVolumeSpecName: "config") pod "d6a8e0cd-1585-416d-9142-fbe4991e8cda" (UID: "d6a8e0cd-1585-416d-9142-fbe4991e8cda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.901283 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-config\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.901312 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6a8e0cd-1585-416d-9142-fbe4991e8cda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.949961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-574b-account-create-qkvsq" event={"ID":"0c1cc988-7c84-4647-98d5-6831aaff2915","Type":"ContainerDied","Data":"003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.950001 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003dd8c971ba45adba3126e73ca5434f407b7cb775a65ce9547eac2746d3a22e"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.950027 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-574b-account-create-qkvsq"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.951423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5krk8" event={"ID":"fa205dc7-c2a0-418c-a696-ccdfa4c0d43f","Type":"ContainerDied","Data":"1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.951466 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be20492b051433d99a938bbe84a38333ddf866549dc79a3ce1f4ead1f804097"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.951502 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5krk8"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.954972 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b7b-account-create-6hhgc" event={"ID":"baa3eaaa-7445-48bc-80fc-753d38628a73","Type":"ContainerDied","Data":"ad73cc4f9efd638bf52e579a67b2c7a0f7854015411152e4bd1fa71790c929aa"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.954999 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad73cc4f9efd638bf52e579a67b2c7a0f7854015411152e4bd1fa71790c929aa"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.955029 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3b7b-account-create-6hhgc"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.957993 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wf9mt" event={"ID":"e4e82288-6d0b-4bc0-b934-5c0c869a8198","Type":"ContainerStarted","Data":"bac5dbc1bd66465003c32bb0bf945a6afc962ebd56594b233ce83b4b293019e0"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.962312 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rbdm" event={"ID":"7e150ca9-5a87-40ba-bbad-bfae139179ae","Type":"ContainerDied","Data":"e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.962359 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.962404 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rbdm"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.965870 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dp6s7" event={"ID":"aa308cc4-8cc5-4a63-926a-033a151f7291","Type":"ContainerDied","Data":"a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.965918 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.965998 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dp6s7"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.970786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b" event={"ID":"d6a8e0cd-1585-416d-9142-fbe4991e8cda","Type":"ContainerDied","Data":"fae57d95dcb3c404f8071f7045bafa84c91d0011114ccab4adbef1ea8e51c254"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.970838 4752 scope.go:117] "RemoveContainer" containerID="7fee1cd1a6c9edd3a15d864d2449f04014cf565e182b53a6fe8f3432c046e3ab"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.971092 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-prc8b"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.973189 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f581-account-create-hwjsp" event={"ID":"04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d","Type":"ContainerDied","Data":"525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2"}
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.973225 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.973304 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f581-account-create-hwjsp"
Nov 24 11:24:34 crc kubenswrapper[4752]: I1124 11:24:34.995300 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wf9mt" podStartSLOduration=3.591273059 podStartE2EDuration="8.995263734s" podCreationTimestamp="2025-11-24 11:24:26 +0000 UTC" firstStartedPulling="2025-11-24 11:24:28.947429165 +0000 UTC m=+1074.932249454" lastFinishedPulling="2025-11-24 11:24:34.35141984 +0000 UTC m=+1080.336240129" observedRunningTime="2025-11-24 11:24:34.988436056 +0000 UTC m=+1080.973256345" watchObservedRunningTime="2025-11-24 11:24:34.995263734 +0000 UTC m=+1080.980084053"
Nov 24 11:24:35 crc kubenswrapper[4752]: I1124 11:24:35.004454 4752 scope.go:117] "RemoveContainer" containerID="2b7c2876d99a171eee9ad830ee388c74f56a340be2c36c547eb8a428b01a7b82"
Nov 24 11:24:35 crc kubenswrapper[4752]: I1124 11:24:35.038317 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"]
Nov 24 11:24:35 crc kubenswrapper[4752]: I1124 11:24:35.073721 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-prc8b"]
Nov 24 11:24:36 crc kubenswrapper[4752]: I1124 11:24:36.739413 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" path="/var/lib/kubelet/pods/d6a8e0cd-1585-416d-9142-fbe4991e8cda/volumes"
Nov 24 11:24:38 crc kubenswrapper[4752]: I1124 11:24:38.004879 4752 generic.go:334] "Generic (PLEG): container finished" podID="e4e82288-6d0b-4bc0-b934-5c0c869a8198" containerID="bac5dbc1bd66465003c32bb0bf945a6afc962ebd56594b233ce83b4b293019e0" exitCode=0
Nov 24 11:24:38 crc kubenswrapper[4752]: I1124 11:24:38.005005 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wf9mt" event={"ID":"e4e82288-6d0b-4bc0-b934-5c0c869a8198","Type":"ContainerDied","Data":"bac5dbc1bd66465003c32bb0bf945a6afc962ebd56594b233ce83b4b293019e0"}
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.380661 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wf9mt"
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.576400 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data\") pod \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") "
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.576529 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw2hs\" (UniqueName: \"kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs\") pod \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") "
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.576660 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle\") pod \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\" (UID: \"e4e82288-6d0b-4bc0-b934-5c0c869a8198\") "
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.585987 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs" (OuterVolumeSpecName: "kube-api-access-bw2hs") pod "e4e82288-6d0b-4bc0-b934-5c0c869a8198" (UID: "e4e82288-6d0b-4bc0-b934-5c0c869a8198"). InnerVolumeSpecName "kube-api-access-bw2hs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.608551 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e82288-6d0b-4bc0-b934-5c0c869a8198" (UID: "e4e82288-6d0b-4bc0-b934-5c0c869a8198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.616591 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data" (OuterVolumeSpecName: "config-data") pod "e4e82288-6d0b-4bc0-b934-5c0c869a8198" (UID: "e4e82288-6d0b-4bc0-b934-5c0c869a8198"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.679010 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.679052 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e82288-6d0b-4bc0-b934-5c0c869a8198-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:39 crc kubenswrapper[4752]: I1124 11:24:39.679065 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw2hs\" (UniqueName: \"kubernetes.io/projected/e4e82288-6d0b-4bc0-b934-5c0c869a8198-kube-api-access-bw2hs\") on node \"crc\" DevicePath \"\""
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.022119 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wf9mt" event={"ID":"e4e82288-6d0b-4bc0-b934-5c0c869a8198","Type":"ContainerDied","Data":"e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79"}
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.022159 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f9df57292a2a2a550468e29e2bca54ac7205bb81be55218b006af27fdb7d79"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.022187 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wf9mt"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194328 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"]
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194872 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194889 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194901 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e82288-6d0b-4bc0-b934-5c0c869a8198" containerName="keystone-db-sync"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194907 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e82288-6d0b-4bc0-b934-5c0c869a8198" containerName="keystone-db-sync"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194917 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa3eaaa-7445-48bc-80fc-753d38628a73" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194925 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa3eaaa-7445-48bc-80fc-753d38628a73" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194943 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="dnsmasq-dns"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194949 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="dnsmasq-dns"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194966 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="init"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194973 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="init"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.194979 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.194986 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.195002 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e150ca9-5a87-40ba-bbad-bfae139179ae" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195008 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e150ca9-5a87-40ba-bbad-bfae139179ae" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.195014 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa308cc4-8cc5-4a63-926a-033a151f7291" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195020 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa308cc4-8cc5-4a63-926a-033a151f7291" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.195032 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1cc988-7c84-4647-98d5-6831aaff2915" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195038 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1cc988-7c84-4647-98d5-6831aaff2915" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195191 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a8e0cd-1585-416d-9142-fbe4991e8cda" containerName="dnsmasq-dns"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195208 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa308cc4-8cc5-4a63-926a-033a151f7291" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195216 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195231 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1cc988-7c84-4647-98d5-6831aaff2915" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195239 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195247 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e150ca9-5a87-40ba-bbad-bfae139179ae" containerName="mariadb-database-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195260 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa3eaaa-7445-48bc-80fc-753d38628a73" containerName="mariadb-account-create"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.195270 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e82288-6d0b-4bc0-b934-5c0c869a8198" containerName="keystone-db-sync"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.196150 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.203477 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"]
Nov 24 11:24:40 crc kubenswrapper[4752]: E1124 11:24:40.268496 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache]"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.271695 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xttvj"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.274365 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.278086 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kdtjx"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.278264 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.278440 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.278566 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.287127 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.290347 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xttvj"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293158 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293203 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293241 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293269 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293299 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293329 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp9x8\" (UniqueName: \"kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293414 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293438 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wgbw\" (UniqueName: \"kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293461 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293484 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.293519 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395162 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395244 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395285 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395386 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395458 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp9x8\" (UniqueName: \"kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395576 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395604 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wgbw\" (UniqueName: \"kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.395630 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.396688 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.397516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.397610 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.397649 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.402823 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.411029 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-t8l2x"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.412024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.418884 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.419191 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.420001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.420784 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.420942 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.423309 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nr2fk"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.427320 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wgbw\" (UniqueName: \"kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.427385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.427548 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys\") pod \"keystone-bootstrap-xttvj\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " pod="openstack/keystone-bootstrap-xttvj"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.442458 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp9x8\" (UniqueName: \"kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8\") pod \"dnsmasq-dns-bbf5cc879-hltxr\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.442521 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t8l2x"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.494143 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.496470 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.496709 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.496853 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-config\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.496946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7bf\" (UniqueName: \"kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.499325 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.499626 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.511208 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.527348 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.588410 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ff2l8"]
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.595082 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599164 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599198 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dd8\" (UniqueName: \"kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599222 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599247 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599271 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599290 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599305 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599325 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599349 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599379 4752
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599398 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ps2r\" (UniqueName: \"kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-config\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599462 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7bf\" (UniqueName: \"kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599514 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.599548 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.611754 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5677w" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.612024 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.612648 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.622444 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xttvj" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.655002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.673823 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-config\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.678350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7bf\" (UniqueName: \"kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf\") pod \"neutron-db-sync-t8l2x\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") " pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.700514 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ff2l8"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701323 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701361 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701396 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701421 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701440 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47dd8\" (UniqueName: \"kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701484 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 
11:24:40.701503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701567 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701605 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.701621 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ps2r\" (UniqueName: \"kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.706941 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.707857 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.708184 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.708330 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.708382 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.710412 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.712432 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.713072 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.713694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.717049 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.718712 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.726149 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fdrhb"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.727233 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.728557 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ps2r\" (UniqueName: \"kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.729822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " pod="openstack/ceilometer-0" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.733366 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dd8\" (UniqueName: \"kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8\") pod \"cinder-db-sync-ff2l8\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") " pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.736410 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.737110 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tv5xf" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.755701 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fdrhb"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.764183 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s49hz"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.765554 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.774290 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.774520 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47vqz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.792333 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.792620 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.794402 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.801876 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s49hz"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.812237 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"] Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.826072 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905273 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905647 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905702 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905728 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905842 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905888 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905925 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: 
\"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.905986 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.906014 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.906506 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj54n\" (UniqueName: \"kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.906617 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtwq6\" (UniqueName: \"kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.906659 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726wp\" (UniqueName: \"kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:40 crc kubenswrapper[4752]: I1124 11:24:40.973835 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.009736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.009811 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.009836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.009874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj54n\" (UniqueName: \"kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.009909 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtwq6\" (UniqueName: \"kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010142 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726wp\" (UniqueName: \"kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010230 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010285 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010354 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 
24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010389 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010527 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010554 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.010654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.011003 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.011136 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.011558 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.011677 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.012220 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.016974 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.017263 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.018582 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.019841 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.026160 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ff2l8" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.029456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-726wp\" (UniqueName: \"kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.032357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data\") pod \"barbican-db-sync-fdrhb\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.036676 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtwq6\" (UniqueName: \"kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6\") pod \"placement-db-sync-s49hz\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.040399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj54n\" (UniqueName: \"kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n\") pod \"dnsmasq-dns-56df8fb6b7-pz4zr\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.059909 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.170347 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s49hz" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.182541 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.201620 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"] Nov 24 11:24:42 crc kubenswrapper[4752]: W1124 11:24:41.334227 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaca9e92_caca_4ab5_b7d0_9e1803191a3c.slice/crio-5ffacd732dc318c12ab7384b1bfb79ce398624dc159310cfb19af842ce941e53 WatchSource:0}: Error finding container 5ffacd732dc318c12ab7384b1bfb79ce398624dc159310cfb19af842ce941e53: Status 404 returned error can't find the container with id 5ffacd732dc318c12ab7384b1bfb79ce398624dc159310cfb19af842ce941e53 Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.367643 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.369235 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.373088 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.373422 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.373572 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.374601 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z5nd5" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.415311 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.437847 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.439261 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.451378 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.452211 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.480816 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520415 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgv2s\" (UniqueName: \"kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520531 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520661 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520681 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520780 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.520831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622398 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622673 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622729 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.622845 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9ztn9\" (UniqueName: \"kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624208 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgv2s\" (UniqueName: \"kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624312 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624362 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624390 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624591 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624620 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.624856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.625497 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.631431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.632264 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.637704 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.639561 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.643510 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.644098 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.646689 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgv2s\" (UniqueName: 
\"kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.677854 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.686635 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ztn9\" (UniqueName: \"kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726189 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726252 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726291 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726325 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726349 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726393 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.726992 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.727374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.727633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.732396 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.735102 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.737667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.738009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.751544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ztn9\" (UniqueName: \"kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:41.773341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.023823 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.057414 4752 generic.go:334] "Generic (PLEG): container finished" podID="caca9e92-caca-4ab5-b7d0-9e1803191a3c" containerID="a9779e301759b8ba70b0d0620030da504f21a74179aac645b6e5a2991bb3b6b9" exitCode=0 Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.057462 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr" event={"ID":"caca9e92-caca-4ab5-b7d0-9e1803191a3c","Type":"ContainerDied","Data":"a9779e301759b8ba70b0d0620030da504f21a74179aac645b6e5a2991bb3b6b9"} Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.057487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr" event={"ID":"caca9e92-caca-4ab5-b7d0-9e1803191a3c","Type":"ContainerStarted","Data":"5ffacd732dc318c12ab7384b1bfb79ce398624dc159310cfb19af842ce941e53"} Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.757996 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xttvj"] Nov 24 11:24:42 crc kubenswrapper[4752]: W1124 11:24:42.758767 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae10b86_7236_4998_81d0_88399d0770a8.slice/crio-2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f WatchSource:0}: Error finding container 2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f: Status 404 returned error can't find the container with id 2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f Nov 24 11:24:42 crc kubenswrapper[4752]: I1124 11:24:42.936626 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052094 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052210 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052236 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052299 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.052346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp9x8\" (UniqueName: \"kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8\") pod \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\" (UID: \"caca9e92-caca-4ab5-b7d0-9e1803191a3c\") " Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.078971 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8" (OuterVolumeSpecName: "kube-api-access-xp9x8") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "kube-api-access-xp9x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.093634 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config" (OuterVolumeSpecName: "config") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
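
Taken together, the records above show both directions of the kubelet's volume lifecycle: for the glance pods, operationExecutor.VerifyControllerAttachedVolume, then MountVolume.MountDevice for the local PVs (which logs the device mount path, e.g. /mnt/openstack/pv11), then MountVolume.SetUp; for the dying dnsmasq-dns-bbf5cc879-hltxr pod, UnmountVolume.TearDown per volume on the way back down. A minimal stdlib-Python sketch for extracting that per-volume sequence from a capture like this one; the phase strings are copied from the sample records, it assumes the journal's native one-record-per-line layout, and the input filename kubelet.log is hypothetical.

import re

# Lifecycle markers exactly as they appear in the kubelet records above.
PHASES = (
    "operationExecutor.VerifyControllerAttachedVolume started",
    "MountVolume.MountDevice succeeded",
    "MountVolume.SetUp succeeded",
    "UnmountVolume.TearDown succeeded",
)
# Mount records name the volume as: for volume \"config-data\" (...);
# TearDown records log the full plugin path instead of the short name.
VOL = re.compile(r'for volume \\?"([^"\\]+)\\?"')

def volume_timeline(lines):
    """Yield (volume, phase) pairs in the order they were logged."""
    for line in lines:
        for phase in PHASES:
            if phase in line:
                found = VOL.search(line)
                if found:
                    yield found.group(1), phase

if __name__ == "__main__":
    with open("kubelet.log") as log:  # hypothetical journal capture
        for volume, phase in volume_timeline(log):
            print(volume, "->", phase)

Run against the excerpt above, the secret and empty-dir volumes go straight from VerifyControllerAttachedVolume to SetUp, while the local volumes (local-storage09-crc, local-storage11-crc) pass through the extra MountDevice staging step.
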
Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.099476 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.104998 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr" event={"ID":"caca9e92-caca-4ab5-b7d0-9e1803191a3c","Type":"ContainerDied","Data":"5ffacd732dc318c12ab7384b1bfb79ce398624dc159310cfb19af842ce941e53"} Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.105052 4752 scope.go:117] "RemoveContainer" containerID="a9779e301759b8ba70b0d0620030da504f21a74179aac645b6e5a2991bb3b6b9" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.105183 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-hltxr" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.105297 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.114350 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.119013 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fdrhb"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.121624 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xttvj" event={"ID":"dae10b86-7236-4998-81d0-88399d0770a8","Type":"ContainerStarted","Data":"fe306ec37ea8c3b9abfe5a4546df04b8bc9a09dc37fbf1c4dfdef5c811afc94e"} Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.121664 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xttvj" event={"ID":"dae10b86-7236-4998-81d0-88399d0770a8","Type":"ContainerStarted","Data":"2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f"} Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.142994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.145307 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "caca9e92-caca-4ab5-b7d0-9e1803191a3c" (UID: "caca9e92-caca-4ab5-b7d0-9e1803191a3c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154682 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154718 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154732 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154747 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154768 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caca9e92-caca-4ab5-b7d0-9e1803191a3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.154776 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp9x8\" (UniqueName: \"kubernetes.io/projected/caca9e92-caca-4ab5-b7d0-9e1803191a3c-kube-api-access-xp9x8\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.188830 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.194146 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ff2l8"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.212170 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.227093 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s49hz"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.232989 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xttvj" podStartSLOduration=3.232971435 podStartE2EDuration="3.232971435s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:43.179723027 +0000 UTC m=+1089.164543326" watchObservedRunningTime="2025-11-24 11:24:43.232971435 +0000 UTC m=+1089.217791724" Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.247793 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-t8l2x"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.253312 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.270808 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:43 crc kubenswrapper[4752]: W1124 11:24:43.281940 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc246d067_016f_43a4_8b99_cf00683e3306.slice/crio-87ff943193cb0478a4756e64984e293c16b4f5ae58e5f27e0e11737099a0761e WatchSource:0}: Error finding container 87ff943193cb0478a4756e64984e293c16b4f5ae58e5f27e0e11737099a0761e: Status 404 returned error can't find the container with id 87ff943193cb0478a4756e64984e293c16b4f5ae58e5f27e0e11737099a0761e Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.379483 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.591696 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"] Nov 24 11:24:43 crc kubenswrapper[4752]: I1124 11:24:43.606946 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-hltxr"] Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.144835 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t8l2x" event={"ID":"2c082fce-df36-4282-9630-e2f3089fa482","Type":"ContainerStarted","Data":"e55b3974f19bd84417679acf10b5aaff2058381e530e84be3ead5c645b2ddec0"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.145192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t8l2x" event={"ID":"2c082fce-df36-4282-9630-e2f3089fa482","Type":"ContainerStarted","Data":"f779a517af572778171c60a2301d31202063615fd65ed6ae953080d5ca435b9f"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.148521 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fdrhb" event={"ID":"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4","Type":"ContainerStarted","Data":"afb52af5fe007dede7579705ed524275c030273563c931d2b0a0302172a0835b"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.169898 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerStarted","Data":"87ff943193cb0478a4756e64984e293c16b4f5ae58e5f27e0e11737099a0761e"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.175799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff2l8" event={"ID":"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9","Type":"ContainerStarted","Data":"b4a3bdf00e9d790da0d1fd427c18837eb66d7fe486248c2c030f879f87dd2969"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.194852 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s49hz" event={"ID":"96ca5590-9544-4b84-b86d-1ec3ef57a829","Type":"ContainerStarted","Data":"8bbe99dd7b498081d1e77dc7d049f2ffff9c5aacb9d2a7a74f8e0ef9c7bab0cb"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.201604 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerStarted","Data":"ee1787e9ba7bd24e18ff2cc096607abe7b971d4d9bae956410d7615f3d5959b7"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.211530 4752 generic.go:334] "Generic (PLEG): container finished" podID="8bf654b7-df16-44f1-99a8-7eb680634013" containerID="4df45b8527f2c003f39ff0dbda540833a7117cc872aceb703558a81c2b0f4ae1" exitCode=0 Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.212946 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" 
event={"ID":"8bf654b7-df16-44f1-99a8-7eb680634013","Type":"ContainerDied","Data":"4df45b8527f2c003f39ff0dbda540833a7117cc872aceb703558a81c2b0f4ae1"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.213014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" event={"ID":"8bf654b7-df16-44f1-99a8-7eb680634013","Type":"ContainerStarted","Data":"6ee29efca3f221bc0072713257c24428eec1b696d3bb5a1b45901ceecef7a46e"} Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.214982 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-t8l2x" podStartSLOduration=4.21496589 podStartE2EDuration="4.21496589s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:44.167973142 +0000 UTC m=+1090.152793431" watchObservedRunningTime="2025-11-24 11:24:44.21496589 +0000 UTC m=+1090.199786179" Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.221099 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:24:44 crc kubenswrapper[4752]: I1124 11:24:44.755104 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caca9e92-caca-4ab5-b7d0-9e1803191a3c" path="/var/lib/kubelet/pods/caca9e92-caca-4ab5-b7d0-9e1803191a3c/volumes" Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.227627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerStarted","Data":"e3179ae533e6c957090d48c858d41f79edb1c5138ef321b0b2afa286bfe894d8"} Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.242852 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" event={"ID":"8bf654b7-df16-44f1-99a8-7eb680634013","Type":"ContainerStarted","Data":"2081cff28fb25e3157d2d48fa39f73ccfb779669f82a6e60a027ff939e135e2c"} Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.242896 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.253862 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerStarted","Data":"70eb8854934d561903625898d2298bc404bbd97fb551fd51ea4981a3958681d2"} Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.286541 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" podStartSLOduration=5.286523412 podStartE2EDuration="5.286523412s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:45.277934384 +0000 UTC m=+1091.262754683" watchObservedRunningTime="2025-11-24 11:24:45.286523412 +0000 UTC m=+1091.271343701" Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.468964 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:24:45 crc kubenswrapper[4752]: I1124 11:24:45.469096 4752 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:24:46 crc kubenswrapper[4752]: I1124 11:24:46.270185 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerStarted","Data":"5cfc996b90ac6dd5cd0d523c98e7b600760a45f7c195157e2cf3847b643d0845"} Nov 24 11:24:46 crc kubenswrapper[4752]: I1124 11:24:46.274837 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-log" containerID="cri-o://70eb8854934d561903625898d2298bc404bbd97fb551fd51ea4981a3958681d2" gracePeriod=30 Nov 24 11:24:46 crc kubenswrapper[4752]: I1124 11:24:46.274921 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerStarted","Data":"62f1b4e3dce5d3e873840a291f78123117fda6d27c0f90cfb181435975c30d57"} Nov 24 11:24:46 crc kubenswrapper[4752]: I1124 11:24:46.274941 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-httpd" containerID="cri-o://62f1b4e3dce5d3e873840a291f78123117fda6d27c0f90cfb181435975c30d57" gracePeriod=30 Nov 24 11:24:46 crc kubenswrapper[4752]: I1124 11:24:46.300204 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.30018184 podStartE2EDuration="6.30018184s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:46.293409444 +0000 UTC m=+1092.278229743" watchObservedRunningTime="2025-11-24 11:24:46.30018184 +0000 UTC m=+1092.285002129" Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.284047 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerStarted","Data":"5b5cf439f6d8f889e8bf939804560389978a22cd31f7c8ca5d69f808c9d5b8bc"} Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.284495 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-log" containerID="cri-o://5cfc996b90ac6dd5cd0d523c98e7b600760a45f7c195157e2cf3847b643d0845" gracePeriod=30 Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.284989 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-httpd" containerID="cri-o://5b5cf439f6d8f889e8bf939804560389978a22cd31f7c8ca5d69f808c9d5b8bc" gracePeriod=30 Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.290080 4752 generic.go:334] "Generic (PLEG): container finished" podID="c246d067-016f-43a4-8b99-cf00683e3306" containerID="62f1b4e3dce5d3e873840a291f78123117fda6d27c0f90cfb181435975c30d57" exitCode=0
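
Above, the kubelet kills the freshly started glance containers with gracePeriod=30, and the PLEG records here and immediately below report how each one exited: glance-httpd returns exitCode=0 (it shut down cleanly within the grace period), while the glance-log containers return exitCode=143, i.e. 128 + 15, meaning the process was still running when SIGTERM landed. A small stdlib-Python sketch that pairs the kill records with their matching exit codes; the regexes are fitted to the sample lines, it assumes one journal record per line, and the filename kubelet.log is hypothetical.

import re

# "Killing container with a grace period" ... containerName="glance-log"
#   containerID="cri-o://<64-hex-id>" gracePeriod=30
KILL = re.compile(r'Killing container with a grace period.*?'
                  r'containerName="([^"]+)".*?containerID="cri-o://([0-9a-f]+)"')
# "Generic (PLEG): container finished" ... containerID="<64-hex-id>" exitCode=<n>
DONE = re.compile(r'container finished.*?containerID="([0-9a-f]+)" exitCode=(\d+)')

def kill_outcomes(lines):
    """Map container ID -> (container name, exit code) for killed containers."""
    names, outcomes = {}, {}
    for line in lines:
        kill = KILL.search(line)
        if kill:
            names[kill.group(2)] = kill.group(1)
        done = DONE.search(line)
        if done and done.group(1) in names:
            outcomes[done.group(1)] = (names[done.group(1)], int(done.group(2)))
    return outcomes

if __name__ == "__main__":
    with open("kubelet.log") as log:  # hypothetical journal capture
        for cid, (name, code) in kill_outcomes(log).items():
            # exit code 143 = 128 + SIGTERM(15): terminated by the graceful-kill signal
            print(name, cid[:13], code)
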
Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.290102 4752 generic.go:334] "Generic (PLEG): container finished" podID="c246d067-016f-43a4-8b99-cf00683e3306" containerID="70eb8854934d561903625898d2298bc404bbd97fb551fd51ea4981a3958681d2" exitCode=143 Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.290280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerDied","Data":"62f1b4e3dce5d3e873840a291f78123117fda6d27c0f90cfb181435975c30d57"} Nov 24 11:24:47 crc kubenswrapper[4752]: I1124 11:24:47.290331 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerDied","Data":"70eb8854934d561903625898d2298bc404bbd97fb551fd51ea4981a3958681d2"} Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.310310 4752 generic.go:334] "Generic (PLEG): container finished" podID="dae10b86-7236-4998-81d0-88399d0770a8" containerID="fe306ec37ea8c3b9abfe5a4546df04b8bc9a09dc37fbf1c4dfdef5c811afc94e" exitCode=0 Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.310391 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xttvj" event={"ID":"dae10b86-7236-4998-81d0-88399d0770a8","Type":"ContainerDied","Data":"fe306ec37ea8c3b9abfe5a4546df04b8bc9a09dc37fbf1c4dfdef5c811afc94e"} Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.316709 4752 generic.go:334] "Generic (PLEG): container finished" podID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerID="5b5cf439f6d8f889e8bf939804560389978a22cd31f7c8ca5d69f808c9d5b8bc" exitCode=0 Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.316776 4752 generic.go:334] "Generic (PLEG): container finished" podID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerID="5cfc996b90ac6dd5cd0d523c98e7b600760a45f7c195157e2cf3847b643d0845" exitCode=143 Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.316811 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerDied","Data":"5b5cf439f6d8f889e8bf939804560389978a22cd31f7c8ca5d69f808c9d5b8bc"} Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.316854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerDied","Data":"5cfc996b90ac6dd5cd0d523c98e7b600760a45f7c195157e2cf3847b643d0845"} Nov 24 11:24:48 crc kubenswrapper[4752]: I1124 11:24:48.338230 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.338043793 podStartE2EDuration="8.338043793s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:24:47.310297987 +0000 UTC m=+1093.295118276" watchObservedRunningTime="2025-11-24 11:24:48.338043793 +0000 UTC m=+1094.323005656" Nov 24 11:24:50 crc kubenswrapper[4752]: E1124 11:24:50.474853 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache]" Nov 24 11:24:51 crc kubenswrapper[4752]: I1124 11:24:51.183915 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:24:51 crc kubenswrapper[4752]: I1124 11:24:51.257047 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:24:51 crc kubenswrapper[4752]: I1124 11:24:51.257349 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" containerID="cri-o://aab9554c3d38909724fd4664e4734e67c9131a32b84339e4dc3f11e690fd71f1" gracePeriod=10 Nov 24 11:24:52 crc kubenswrapper[4752]: I1124 11:24:52.364995 4752 generic.go:334] "Generic (PLEG): container finished" podID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerID="aab9554c3d38909724fd4664e4734e67c9131a32b84339e4dc3f11e690fd71f1" exitCode=0 Nov 24 11:24:52 crc kubenswrapper[4752]: I1124 11:24:52.365067 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" event={"ID":"2f9fc951-696e-4d2e-aa1e-1180df42a675","Type":"ContainerDied","Data":"aab9554c3d38909724fd4664e4734e67c9131a32b84339e4dc3f11e690fd71f1"} Nov 24 11:24:52 crc kubenswrapper[4752]: I1124 11:24:52.538064 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Nov 24 11:24:56 crc kubenswrapper[4752]: E1124 11:24:56.716068 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 24 11:24:56 crc kubenswrapper[4752]: E1124 11:24:56.716262 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n669h95h7ch57fh87h68ch574h5f7h5chbch8h568h77h5f8hcfh5c8h76h668h65ch68fh669h5cdhcbh97hc5hch599hc7hdbhb8h546h5b6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ps2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ade2fdf2-9bc9-45ee-81cf-25be73764135): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.914601 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.984726 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.984838 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgv2s\" (UniqueName: \"kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.984889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.984955 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.985807 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.986151 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.986291 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.986368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run\") pod \"c246d067-016f-43a4-8b99-cf00683e3306\" (UID: \"c246d067-016f-43a4-8b99-cf00683e3306\") " Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.986623 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs" (OuterVolumeSpecName: "logs") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.986735 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.987370 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:56 crc kubenswrapper[4752]: I1124 11:24:56.987388 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c246d067-016f-43a4-8b99-cf00683e3306-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.009114 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.009172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s" (OuterVolumeSpecName: "kube-api-access-cgv2s") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "kube-api-access-cgv2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.010945 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts" (OuterVolumeSpecName: "scripts") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.028014 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.049218 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.054986 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data" (OuterVolumeSpecName: "config-data") pod "c246d067-016f-43a4-8b99-cf00683e3306" (UID: "c246d067-016f-43a4-8b99-cf00683e3306"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088523 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088551 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088563 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgv2s\" (UniqueName: \"kubernetes.io/projected/c246d067-016f-43a4-8b99-cf00683e3306-kube-api-access-cgv2s\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088571 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088579 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c246d067-016f-43a4-8b99-cf00683e3306-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.088615 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.110698 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.189767 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.414170 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c246d067-016f-43a4-8b99-cf00683e3306","Type":"ContainerDied","Data":"87ff943193cb0478a4756e64984e293c16b4f5ae58e5f27e0e11737099a0761e"} Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.414232 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.414243 4752 scope.go:117] "RemoveContainer" containerID="62f1b4e3dce5d3e873840a291f78123117fda6d27c0f90cfb181435975c30d57" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.457428 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.500471 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.507457 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:57 crc kubenswrapper[4752]: E1124 11:24:57.507996 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-log" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508020 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-log" Nov 24 11:24:57 crc kubenswrapper[4752]: E1124 11:24:57.508040 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-httpd" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508048 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-httpd" Nov 24 11:24:57 crc kubenswrapper[4752]: E1124 11:24:57.508072 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caca9e92-caca-4ab5-b7d0-9e1803191a3c" containerName="init" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508080 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="caca9e92-caca-4ab5-b7d0-9e1803191a3c" containerName="init" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508327 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-log" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508357 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="caca9e92-caca-4ab5-b7d0-9e1803191a3c" containerName="init" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.508376 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c246d067-016f-43a4-8b99-cf00683e3306" containerName="glance-httpd"
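
This is the kubelet's delete-and-recreate pattern in miniature: SyncLoop DELETE and REMOVE retire glance-default-external-api-0 under UID c246d067-016f-43a4-8b99-cf00683e3306, the cpu and memory managers drop their stale per-container state, and SyncLoop ADD immediately re-admits a pod with the same name under a new UID (9b4cf2fd-213b-4101-b13b-9a58eb9f3f28, as the volume records below show) that re-claims the same local PV, local-storage11-crc. A stdlib-Python sketch that summarizes these SyncLoop transitions per pod; the message layout is taken from the sample lines (findall also copes with several records sharing one physical line, as in this capture), and the input filename kubelet.log is hypothetical.

import re
from collections import defaultdict

# kubelet.go: "SyncLoop ADD|UPDATE|DELETE|REMOVE" source="api" pods=["ns/name"]
SYNC = re.compile(r'"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]')

def syncloop_history(lines):
    """Collect the ordered SyncLoop verbs the kubelet saw for each pod."""
    history = defaultdict(list)
    for line in lines:
        for verb, pod in SYNC.findall(line):
            history[pod].append(verb)
    return history

if __name__ == "__main__":
    with open("kubelet.log") as log:  # hypothetical journal capture
        for pod, verbs in syncloop_history(log).items():
            print(pod, "->", " ".join(verbs))

For the excerpt above, openstack/glance-default-external-api-0 ends with DELETE REMOVE ADD: the old pod object is fully gone from the API before the replacement is admitted, which is why the old UID's volumes were torn down and its PV unmounted first.
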
Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.509646 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.513407 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.513976 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.514569 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.530433 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.596176 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.596301 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.596427 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.596520 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.597157 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.597462 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvh9b\" (UniqueName: \"kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.597538 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.597648 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvh9b\" (UniqueName: \"kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699849 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699892 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699948 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699974 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.699994 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.700022 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.700045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.700437 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.700999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.702254 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.708523 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.709424 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.709513 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.711252 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.717387 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvh9b\" (UniqueName: \"kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.743408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " pod="openstack/glance-default-external-api-0" Nov 24 
11:24:57 crc kubenswrapper[4752]: I1124 11:24:57.834262 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.742363 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c246d067-016f-43a4-8b99-cf00683e3306" path="/var/lib/kubelet/pods/c246d067-016f-43a4-8b99-cf00683e3306/volumes" Nov 24 11:24:58 crc kubenswrapper[4752]: E1124 11:24:58.803123 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 24 11:24:58 crc kubenswrapper[4752]: E1124 11:24:58.803294 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qtwq6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-s49hz_openstack(96ca5590-9544-4b84-b86d-1ec3ef57a829): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 11:24:58 crc kubenswrapper[4752]: E1124 11:24:58.804498 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-s49hz" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.872011 4752 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xttvj" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.920271 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wgbw\" (UniqueName: \"kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.920692 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.920797 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.920825 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.920841 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.922860 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys\") pod \"dae10b86-7236-4998-81d0-88399d0770a8\" (UID: \"dae10b86-7236-4998-81d0-88399d0770a8\") " Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.932397 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw" (OuterVolumeSpecName: "kube-api-access-5wgbw") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "kube-api-access-5wgbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.933423 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.934523 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.939882 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts" (OuterVolumeSpecName: "scripts") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:58 crc kubenswrapper[4752]: I1124 11:24:58.972984 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.000552 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data" (OuterVolumeSpecName: "config-data") pod "dae10b86-7236-4998-81d0-88399d0770a8" (UID: "dae10b86-7236-4998-81d0-88399d0770a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032271 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032320 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032332 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032342 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032359 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dae10b86-7236-4998-81d0-88399d0770a8-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.032409 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wgbw\" (UniqueName: \"kubernetes.io/projected/dae10b86-7236-4998-81d0-88399d0770a8-kube-api-access-5wgbw\") on node \"crc\" DevicePath \"\"" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.430886 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xttvj" event={"ID":"dae10b86-7236-4998-81d0-88399d0770a8","Type":"ContainerDied","Data":"2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f"} Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.431007 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2254afe9c035855e8005bb8cdf680c5abaea812b4f347d8542b89e403fe08a0f" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.430917 4752 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xttvj" Nov 24 11:24:59 crc kubenswrapper[4752]: E1124 11:24:59.432975 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-s49hz" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.962046 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xttvj"] Nov 24 11:24:59 crc kubenswrapper[4752]: I1124 11:24:59.968963 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xttvj"] Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.067947 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-v9hbc"] Nov 24 11:25:00 crc kubenswrapper[4752]: E1124 11:25:00.068855 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae10b86-7236-4998-81d0-88399d0770a8" containerName="keystone-bootstrap" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.068890 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae10b86-7236-4998-81d0-88399d0770a8" containerName="keystone-bootstrap" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.069175 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae10b86-7236-4998-81d0-88399d0770a8" containerName="keystone-bootstrap" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.070338 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.073962 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v9hbc"] Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.074963 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.074993 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.075279 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.075315 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kdtjx" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.075501 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.155780 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.155835 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc 
kubenswrapper[4752]: I1124 11:25:00.155873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.156113 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.156136 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tczdq\" (UniqueName: \"kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.156291 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.258651 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.258930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.260310 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.260784 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.260851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.260888 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tczdq\" (UniqueName: \"kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.266328 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.272170 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.272865 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.273440 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.284033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.284519 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tczdq\" (UniqueName: \"kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq\") pod \"keystone-bootstrap-v9hbc\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.415032 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:00 crc kubenswrapper[4752]: E1124 11:25:00.736392 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache]" Nov 24 11:25:00 crc kubenswrapper[4752]: I1124 11:25:00.738547 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae10b86-7236-4998-81d0-88399d0770a8" path="/var/lib/kubelet/pods/dae10b86-7236-4998-81d0-88399d0770a8/volumes" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.530771 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.532688 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.761338 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.769676 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911059 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911136 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911157 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911190 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911248 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ztn9\" (UniqueName: \"kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911268 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911341 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911684 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.911959 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912024 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912056 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912104 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912129 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle\") pod \"fbacd1db-932a-44d7-8f1f-3031b30f0931\" (UID: \"fbacd1db-932a-44d7-8f1f-3031b30f0931\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912331 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs" (OuterVolumeSpecName: "logs") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912456 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6phk5\" (UniqueName: \"kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5\") pod \"2f9fc951-696e-4d2e-aa1e-1180df42a675\" (UID: \"2f9fc951-696e-4d2e-aa1e-1180df42a675\") " Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912887 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.912899 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbacd1db-932a-44d7-8f1f-3031b30f0931-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.917828 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts" (OuterVolumeSpecName: "scripts") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.917846 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.918027 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9" (OuterVolumeSpecName: "kube-api-access-9ztn9") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "kube-api-access-9ztn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.920425 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5" (OuterVolumeSpecName: "kube-api-access-6phk5") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "kube-api-access-6phk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.955787 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.968540 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.970172 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.971654 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.975924 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data" (OuterVolumeSpecName: "config-data") pod "fbacd1db-932a-44d7-8f1f-3031b30f0931" (UID: "fbacd1db-932a-44d7-8f1f-3031b30f0931"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.977671 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.978311 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:07 crc kubenswrapper[4752]: I1124 11:25:07.994853 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config" (OuterVolumeSpecName: "config") pod "2f9fc951-696e-4d2e-aa1e-1180df42a675" (UID: "2f9fc951-696e-4d2e-aa1e-1180df42a675"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022489 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022519 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022531 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ztn9\" (UniqueName: \"kubernetes.io/projected/fbacd1db-932a-44d7-8f1f-3031b30f0931-kube-api-access-9ztn9\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022541 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022551 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022585 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022594 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022603 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022612 4752 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022620 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbacd1db-932a-44d7-8f1f-3031b30f0931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022628 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6phk5\" (UniqueName: \"kubernetes.io/projected/2f9fc951-696e-4d2e-aa1e-1180df42a675-kube-api-access-6phk5\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.022635 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f9fc951-696e-4d2e-aa1e-1180df42a675-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.041399 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.124580 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.503625 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbacd1db-932a-44d7-8f1f-3031b30f0931","Type":"ContainerDied","Data":"e3179ae533e6c957090d48c858d41f79edb1c5138ef321b0b2afa286bfe894d8"} Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.503664 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.505849 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" event={"ID":"2f9fc951-696e-4d2e-aa1e-1180df42a675","Type":"ContainerDied","Data":"2aec3a77a1a43c9eece1f743737ce874f36615f7d74223cfd9676df7feb9f314"} Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.505932 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.547150 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.556372 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wlhcd"] Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.563440 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.575305 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581070 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.581502 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-log" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581540 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-log" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.581562 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581570 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.581617 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-httpd" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581628 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-httpd" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.581647 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="init" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581654 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="init" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581899 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-log" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581952 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.581963 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" containerName="glance-httpd" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.583087 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.588422 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.589261 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.591135 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.736690 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.736819 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.736851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.736879 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.737922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.738018 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.738176 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cqrt\" (UniqueName: \"kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.738339 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.738661 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" path="/var/lib/kubelet/pods/2f9fc951-696e-4d2e-aa1e-1180df42a675/volumes" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.739432 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbacd1db-932a-44d7-8f1f-3031b30f0931" path="/var/lib/kubelet/pods/fbacd1db-932a-44d7-8f1f-3031b30f0931/volumes" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.840420 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.840882 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.840950 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.840996 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cqrt\" (UniqueName: \"kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841168 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841228 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841330 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841393 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841532 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.841764 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.848144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.848270 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.849162 4752 scope.go:117] "RemoveContainer" containerID="70eb8854934d561903625898d2298bc404bbd97fb551fd51ea4981a3958681d2" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.850501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.852332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.863415 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cqrt\" (UniqueName: \"kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " 
pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.874620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.880814 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.881004 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-47dd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ff2l8_openstack(dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 11:25:08 crc kubenswrapper[4752]: E1124 11:25:08.882178 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ff2l8" podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" Nov 24 11:25:08 crc kubenswrapper[4752]: I1124 11:25:08.920035 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.133165 4752 scope.go:117] "RemoveContainer" containerID="5b5cf439f6d8f889e8bf939804560389978a22cd31f7c8ca5d69f808c9d5b8bc" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.227495 4752 scope.go:117] "RemoveContainer" containerID="5cfc996b90ac6dd5cd0d523c98e7b600760a45f7c195157e2cf3847b643d0845" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.274581 4752 scope.go:117] "RemoveContainer" containerID="aab9554c3d38909724fd4664e4734e67c9131a32b84339e4dc3f11e690fd71f1" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.311204 4752 scope.go:117] "RemoveContainer" containerID="4110fb4eaf2a33bcb9b0aa5b35b8932cde2412fa52a1416f911ab446c9d3029e" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.548035 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerStarted","Data":"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3"} Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.551811 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fdrhb" event={"ID":"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4","Type":"ContainerStarted","Data":"c2dc34b980c8970786595272b69db9751ab1245dd93ac741139b987c4b76363e"} Nov 24 11:25:09 crc kubenswrapper[4752]: E1124 11:25:09.552856 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ff2l8" podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.588371 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fdrhb" podStartSLOduration=3.622493701 podStartE2EDuration="29.588351876s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="2025-11-24 11:24:43.143925183 +0000 UTC m=+1089.128745472" lastFinishedPulling="2025-11-24 11:25:09.109783358 +0000 UTC m=+1115.094603647" observedRunningTime="2025-11-24 11:25:09.584122594 +0000 UTC m=+1115.568942883" watchObservedRunningTime="2025-11-24 11:25:09.588351876 +0000 UTC m=+1115.573172185" Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.622851 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-v9hbc"] Nov 24 11:25:09 crc kubenswrapper[4752]: W1124 11:25:09.628318 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe564c7e_869d_40d3_9472_839c4fbb51e6.slice/crio-ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9 WatchSource:0}: Error finding container ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9: Status 404 returned error can't find the container with id ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9 Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.665939 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] 
Nov 24 11:25:09 crc kubenswrapper[4752]: I1124 11:25:09.812600 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 24 11:25:09 crc kubenswrapper[4752]: W1124 11:25:09.813220 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda512d7d1_a13c_4c87_a20b_81a2ae62655a.slice/crio-eed6ef5813eb0a5321cdbdf048b830fc768f0ac22fc0f88d819d573bea155758 WatchSource:0}: Error finding container eed6ef5813eb0a5321cdbdf048b830fc768f0ac22fc0f88d819d573bea155758: Status 404 returned error can't find the container with id eed6ef5813eb0a5321cdbdf048b830fc768f0ac22fc0f88d819d573bea155758
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.566500 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v9hbc" event={"ID":"fe564c7e-869d-40d3-9472-839c4fbb51e6","Type":"ContainerStarted","Data":"681a52540e99f093687c5c5cb86c74a9c9b1f74b74796ab532bbc3a3af899758"}
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.567153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v9hbc" event={"ID":"fe564c7e-869d-40d3-9472-839c4fbb51e6","Type":"ContainerStarted","Data":"ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9"}
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.572368 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerStarted","Data":"e81d5a0b95235237f39a31dd9ae847f30d22805d65c467d4d9437a45460ca9d7"}
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.572424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerStarted","Data":"df5c05c48db5b1b37626d2bc284eb52b40b045399ddc23068bdf8e6feafd0f26"}
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.582213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerStarted","Data":"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928"}
Nov 24 11:25:10 crc kubenswrapper[4752]: I1124 11:25:10.582330 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerStarted","Data":"eed6ef5813eb0a5321cdbdf048b830fc768f0ac22fc0f88d819d573bea155758"}
Nov 24 11:25:10 crc kubenswrapper[4752]: E1124 11:25:10.996436 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache]"
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.593447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerStarted","Data":"15375c58a53fe633e1759e20c9d16ad63468a36df47866fd2c275d06f38ec48e"}
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.595528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerStarted","Data":"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d"}
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.600230 4752 generic.go:334] "Generic (PLEG): container finished" podID="2c082fce-df36-4282-9630-e2f3089fa482" containerID="e55b3974f19bd84417679acf10b5aaff2058381e530e84be3ead5c645b2ddec0" exitCode=0
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.600346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t8l2x" event={"ID":"2c082fce-df36-4282-9630-e2f3089fa482","Type":"ContainerDied","Data":"e55b3974f19bd84417679acf10b5aaff2058381e530e84be3ead5c645b2ddec0"}
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.625100 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-v9hbc" podStartSLOduration=11.625082585 podStartE2EDuration="11.625082585s" podCreationTimestamp="2025-11-24 11:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:10.589603857 +0000 UTC m=+1116.574424146" watchObservedRunningTime="2025-11-24 11:25:11.625082585 +0000 UTC m=+1117.609902874"
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.627612 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.627605618 podStartE2EDuration="14.627605618s" podCreationTimestamp="2025-11-24 11:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:11.624247731 +0000 UTC m=+1117.609068020" watchObservedRunningTime="2025-11-24 11:25:11.627605618 +0000 UTC m=+1117.612425907"
Nov 24 11:25:11 crc kubenswrapper[4752]: I1124 11:25:11.649659 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.649636445 podStartE2EDuration="3.649636445s" podCreationTimestamp="2025-11-24 11:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:11.649358427 +0000 UTC m=+1117.634178726" watchObservedRunningTime="2025-11-24 11:25:11.649636445 +0000 UTC m=+1117.634456754"
Nov 24 11:25:12 crc kubenswrapper[4752]: I1124 11:25:12.532423 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wlhcd" podUID="2f9fc951-696e-4d2e-aa1e-1180df42a675" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout"
Nov 24 11:25:12 crc kubenswrapper[4752]: I1124 11:25:12.623924 4752 generic.go:334] "Generic (PLEG): container finished" podID="9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" containerID="c2dc34b980c8970786595272b69db9751ab1245dd93ac741139b987c4b76363e" exitCode=0
Nov 24 11:25:12 crc kubenswrapper[4752]: I1124 11:25:12.624858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fdrhb" event={"ID":"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4","Type":"ContainerDied","Data":"c2dc34b980c8970786595272b69db9751ab1245dd93ac741139b987c4b76363e"}
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.213005 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t8l2x"
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.338304 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-config\") pod \"2c082fce-df36-4282-9630-e2f3089fa482\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") "
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.338468 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw7bf\" (UniqueName: \"kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf\") pod \"2c082fce-df36-4282-9630-e2f3089fa482\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") "
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.338596 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle\") pod \"2c082fce-df36-4282-9630-e2f3089fa482\" (UID: \"2c082fce-df36-4282-9630-e2f3089fa482\") "
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.344556 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf" (OuterVolumeSpecName: "kube-api-access-vw7bf") pod "2c082fce-df36-4282-9630-e2f3089fa482" (UID: "2c082fce-df36-4282-9630-e2f3089fa482"). InnerVolumeSpecName "kube-api-access-vw7bf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.373891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c082fce-df36-4282-9630-e2f3089fa482" (UID: "2c082fce-df36-4282-9630-e2f3089fa482"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.440159 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw7bf\" (UniqueName: \"kubernetes.io/projected/2c082fce-df36-4282-9630-e2f3089fa482-kube-api-access-vw7bf\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.440195 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.440210 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2c082fce-df36-4282-9630-e2f3089fa482-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.634719 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe564c7e-869d-40d3-9472-839c4fbb51e6" containerID="681a52540e99f093687c5c5cb86c74a9c9b1f74b74796ab532bbc3a3af899758" exitCode=0 Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.634820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v9hbc" event={"ID":"fe564c7e-869d-40d3-9472-839c4fbb51e6","Type":"ContainerDied","Data":"681a52540e99f093687c5c5cb86c74a9c9b1f74b74796ab532bbc3a3af899758"} Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.636801 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-t8l2x" event={"ID":"2c082fce-df36-4282-9630-e2f3089fa482","Type":"ContainerDied","Data":"f779a517af572778171c60a2301d31202063615fd65ed6ae953080d5ca435b9f"} Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.636845 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f779a517af572778171c60a2301d31202063615fd65ed6ae953080d5ca435b9f" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.636817 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-t8l2x" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.863715 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:13 crc kubenswrapper[4752]: E1124 11:25:13.866990 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c082fce-df36-4282-9630-e2f3089fa482" containerName="neutron-db-sync" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.867017 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c082fce-df36-4282-9630-e2f3089fa482" containerName="neutron-db-sync" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.867178 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c082fce-df36-4282-9630-e2f3089fa482" containerName="neutron-db-sync" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.872503 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.876223 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.955561 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.955871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.955892 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhktm\" (UniqueName: \"kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.955958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.955984 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.956010 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.971391 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76c46557f6-h7874"] Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.981294 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.983854 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76c46557f6-h7874"] Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.986958 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nr2fk" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.987285 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.987466 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 24 11:25:13 crc kubenswrapper[4752]: I1124 11:25:13.987596 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057399 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057496 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8kt\" (UniqueName: \"kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057537 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057580 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057612 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle\") pod \"neutron-76c46557f6-h7874\" 
(UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057644 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057666 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057686 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhktm\" (UniqueName: \"kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.057707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.058357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.058964 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.059269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.059367 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.059481 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 
11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.081721 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhktm\" (UniqueName: \"kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm\") pod \"dnsmasq-dns-6b7b667979-xv98q\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.159810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8kt\" (UniqueName: \"kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.159865 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.159907 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.159939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.159983 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.167719 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.171149 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.173576 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.187665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.190997 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8kt\" (UniqueName: \"kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt\") pod \"neutron-76c46557f6-h7874\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") " pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.219096 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:14 crc kubenswrapper[4752]: I1124 11:25:14.304119 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.468382 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.468782 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.468839 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.469560 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.469617 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c" gracePeriod=600 Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.603354 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.697328 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle\") pod \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.697827 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data\") pod \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.698001 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-726wp\" (UniqueName: \"kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp\") pod \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\" (UID: \"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.707574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp" (OuterVolumeSpecName: "kube-api-access-726wp") pod "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" (UID: "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4"). InnerVolumeSpecName "kube-api-access-726wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.707649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" (UID: "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.724907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" (UID: "9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.754635 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c" exitCode=0 Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.755259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c"} Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.755354 4752 scope.go:117] "RemoveContainer" containerID="232a5964f5a1b0c13db2c00fa30a37e21418af2bce45250ff9defff836c970d9" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.759325 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-v9hbc" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.776575 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-v9hbc" event={"ID":"fe564c7e-869d-40d3-9472-839c4fbb51e6","Type":"ContainerDied","Data":"ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9"} Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.776777 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce719147c88ab77052249e418796ffaa58b3842303b47e1be14af85343e7a6b9" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.781855 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fdrhb" event={"ID":"9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4","Type":"ContainerDied","Data":"afb52af5fe007dede7579705ed524275c030273563c931d2b0a0302172a0835b"} Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.781901 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb52af5fe007dede7579705ed524275c030273563c931d2b0a0302172a0835b" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.782131 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fdrhb" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.811938 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.812124 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-726wp\" (UniqueName: \"kubernetes.io/projected/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-kube-api-access-726wp\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.812142 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.913989 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.914071 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.914171 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.914217 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc 
kubenswrapper[4752]: I1124 11:25:15.914274 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tczdq\" (UniqueName: \"kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.914297 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts\") pod \"fe564c7e-869d-40d3-9472-839c4fbb51e6\" (UID: \"fe564c7e-869d-40d3-9472-839c4fbb51e6\") " Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.928354 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.936179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq" (OuterVolumeSpecName: "kube-api-access-tczdq") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "kube-api-access-tczdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.947205 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.957594 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts" (OuterVolumeSpecName: "scripts") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.989840 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data" (OuterVolumeSpecName: "config-data") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:15 crc kubenswrapper[4752]: I1124 11:25:15.997662 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe564c7e-869d-40d3-9472-839c4fbb51e6" (UID: "fe564c7e-869d-40d3-9472-839c4fbb51e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025342 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025373 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025381 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025389 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025398 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tczdq\" (UniqueName: \"kubernetes.io/projected/fe564c7e-869d-40d3-9472-839c4fbb51e6-kube-api-access-tczdq\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.025406 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe564c7e-869d-40d3-9472-839c4fbb51e6-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.135479 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.337260 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76c46557f6-h7874"] Nov 24 11:25:16 crc kubenswrapper[4752]: W1124 11:25:16.388366 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42096ad9_dd99_4b49_ad30_57e43f4e97a1.slice/crio-0ae6b9242a8a0dc207b186cc8a763c94f3d1c7de06c516cc7afb9b2a21473d4d WatchSource:0}: Error finding container 0ae6b9242a8a0dc207b186cc8a763c94f3d1c7de06c516cc7afb9b2a21473d4d: Status 404 returned error can't find the container with id 0ae6b9242a8a0dc207b186cc8a763c94f3d1c7de06c516cc7afb9b2a21473d4d Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.803627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerStarted","Data":"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf"} Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.806480 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39"} Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.814203 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerStarted","Data":"2083f407f786cbdc50075533918be421651e6ec55fbad9e1b8c579416e048543"} Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.814251 4752 kubelet.go:2453] "SyncLoop 
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.814251 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerStarted","Data":"0ae6b9242a8a0dc207b186cc8a763c94f3d1c7de06c516cc7afb9b2a21473d4d"}
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.822122 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s49hz" event={"ID":"96ca5590-9544-4b84-b86d-1ec3ef57a829","Type":"ContainerStarted","Data":"28993df0524c42d77ebd5a549957c79d732fa86a3d6de2ac8c0b38ed64532ac2"}
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.833482 4752 generic.go:334] "Generic (PLEG): container finished" podID="790096ba-f8cd-46da-af7a-91992b6f39df" containerID="b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2" exitCode=0
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.833560 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-v9hbc"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.834602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" event={"ID":"790096ba-f8cd-46da-af7a-91992b6f39df","Type":"ContainerDied","Data":"b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2"}
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.834641 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" event={"ID":"790096ba-f8cd-46da-af7a-91992b6f39df","Type":"ContainerStarted","Data":"963a7a70e819415eaaccc34348dd83ca14297d66400b312712c685c33a7bce40"}
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.912809 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"]
Nov 24 11:25:16 crc kubenswrapper[4752]: E1124 11:25:16.913409 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" containerName="barbican-db-sync"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.913429 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" containerName="barbican-db-sync"
Nov 24 11:25:16 crc kubenswrapper[4752]: E1124 11:25:16.913478 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe564c7e-869d-40d3-9472-839c4fbb51e6" containerName="keystone-bootstrap"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.913487 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe564c7e-869d-40d3-9472-839c4fbb51e6" containerName="keystone-bootstrap"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.913676 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" containerName="barbican-db-sync"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.913687 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe564c7e-869d-40d3-9472-839c4fbb51e6" containerName="keystone-bootstrap"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.922817 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.945082 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.945658 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.945794 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tv5xf"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.953216 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"]
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.954768 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:16 crc kubenswrapper[4752]: I1124 11:25:16.959958 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:16.994935 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"]
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.027609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"]
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041784 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041814 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq42q\" (UniqueName: \"kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041849 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxnkx\" (UniqueName: \"kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041953 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041973 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.041989 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.042020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.062222 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s49hz" podStartSLOduration=4.671827981 podStartE2EDuration="37.062201578s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="2025-11-24 11:24:43.247135985 +0000 UTC m=+1089.231956274" lastFinishedPulling="2025-11-24 11:25:15.637509582 +0000 UTC m=+1121.622329871" observedRunningTime="2025-11-24 11:25:16.981054283 +0000 UTC m=+1122.965874572" watchObservedRunningTime="2025-11-24 11:25:17.062201578 +0000 UTC m=+1123.047021867"
Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.104765 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"]
Need to start a new one" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.124724 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.124836 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.125047 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.125130 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kdtjx" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.125173 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.125675 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.154197 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159693 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxnkx\" (UniqueName: \"kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159726 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.159901 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.160079 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.160128 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq42q\" (UniqueName: \"kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.160186 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.161848 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.162069 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.199972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.203079 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.205612 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc 
kubenswrapper[4752]: I1124 11:25:17.211613 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.220644 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxnkx\" (UniqueName: \"kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.221108 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.234143 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.235203 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data\") pod \"barbican-keystone-listener-57bcdc7dc8-2mzt6\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.240730 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq42q\" (UniqueName: \"kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.243139 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom\") pod \"barbican-worker-7ffb7c9ccc-wsc5d\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.266837 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6pv\" (UniqueName: \"kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.266926 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.266969 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 
11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.267056 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.267083 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.267196 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.267239 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.267289 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.284177 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.284804 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.289425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.295814 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.305788 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.313089 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.314127 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.326167 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.345143 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374707 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374731 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374832 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflck\" (UniqueName: \"kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle\") pod \"keystone-7f4ddd687-vk74x\" (UID: 
\"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374911 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374965 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.374986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.375896 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.375920 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcn4\" (UniqueName: \"kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.375943 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.376007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.376044 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.376598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: 
\"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.376757 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.376850 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6pv\" (UniqueName: \"kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.379796 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.380639 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.381284 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.385082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.390642 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.392000 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.394494 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.398729 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.399344 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.399549 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.399794 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.407209 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6pv\" (UniqueName: \"kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv\") pod \"keystone-7f4ddd687-vk74x\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.463971 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"] Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479287 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479323 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479344 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479384 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0\") pod 
\"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479403 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479424 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479447 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479469 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflck\" (UniqueName: \"kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479507 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479551 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479566 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcn4\" (UniqueName: \"kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" 
(UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479687 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmllg\" (UniqueName: \"kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.479713 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.482282 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.483109 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.484031 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.486597 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.488279 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc 
kubenswrapper[4752]: I1124 11:25:17.488800 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.488963 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.489501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.504774 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.513016 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.513544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflck\" (UniqueName: \"kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck\") pod \"dnsmasq-dns-848cf88cfc-pdfvr\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.517289 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcn4\" (UniqueName: \"kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4\") pod \"barbican-api-77cb55bf5d-j9l7d\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586636 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmllg\" (UniqueName: \"kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586686 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586725 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" 
(UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586754 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586771 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586789 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.586843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.612654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.626469 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmllg\" (UniqueName: \"kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.630588 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.633933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.639506 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.640524 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.643585 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs\") pod \"neutron-56dd6c8857-kp42z\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") " pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.651295 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.683345 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.734225 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.846331 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.846387 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.888562 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerStarted","Data":"11fb1748bacf7cdd4881adda1adafd0b56fde11f31f6e1af46f23c8badc5f64a"} Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.898933 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.915254 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="dnsmasq-dns" containerID="cri-o://2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079" gracePeriod=10 Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.915369 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" event={"ID":"790096ba-f8cd-46da-af7a-91992b6f39df","Type":"ContainerStarted","Data":"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079"} Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.915613 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.940046 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.945300 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 11:25:17 crc kubenswrapper[4752]: I1124 11:25:17.955630 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76c46557f6-h7874" podStartSLOduration=4.955603892 podStartE2EDuration="4.955603892s" podCreationTimestamp="2025-11-24 
11:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:17.943044259 +0000 UTC m=+1123.927864548" watchObservedRunningTime="2025-11-24 11:25:17.955603892 +0000 UTC m=+1123.940424191" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.018115 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" podStartSLOduration=5.018088528 podStartE2EDuration="5.018088528s" podCreationTimestamp="2025-11-24 11:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:17.999785599 +0000 UTC m=+1123.984605898" watchObservedRunningTime="2025-11-24 11:25:18.018088528 +0000 UTC m=+1124.002908817" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.071427 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"] Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.242414 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"] Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.510945 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.633813 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.633907 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.633928 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.633971 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.633997 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.634047 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhktm\" (UniqueName: \"kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm\") pod \"790096ba-f8cd-46da-af7a-91992b6f39df\" (UID: \"790096ba-f8cd-46da-af7a-91992b6f39df\") " Nov 24 11:25:18 crc kubenswrapper[4752]: 
I1124 11:25:18.678440 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"] Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.693081 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm" (OuterVolumeSpecName: "kube-api-access-rhktm") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "kube-api-access-rhktm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.698501 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.702058 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config" (OuterVolumeSpecName: "config") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.704415 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.726596 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.728602 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "790096ba-f8cd-46da-af7a-91992b6f39df" (UID: "790096ba-f8cd-46da-af7a-91992b6f39df"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735699 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735730 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735754 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735763 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735772 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhktm\" (UniqueName: \"kubernetes.io/projected/790096ba-f8cd-46da-af7a-91992b6f39df-kube-api-access-rhktm\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.735783 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/790096ba-f8cd-46da-af7a-91992b6f39df-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.780468 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"] Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.815120 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:18 crc kubenswrapper[4752]: W1124 11:25:18.842876 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod037885e0_98e4_4c61_b8dd_652d9697be28.slice/crio-90f029dec2cc57e8a911ef2cff832684b61182924b2a4081c3429a3b0685959c WatchSource:0}: Error finding container 90f029dec2cc57e8a911ef2cff832684b61182924b2a4081c3429a3b0685959c: Status 404 returned error can't find the container with id 90f029dec2cc57e8a911ef2cff832684b61182924b2a4081c3429a3b0685959c Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.924881 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.924927 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.964876 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerStarted","Data":"0767e4392a4f29bb6c6db310302c1e32144c1a0148514dd43a2006f11eb94daa"} Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.981755 4752 generic.go:334] "Generic (PLEG): container finished" podID="790096ba-f8cd-46da-af7a-91992b6f39df" containerID="2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079" exitCode=0 Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.981951 4752 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.981950 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" event={"ID":"790096ba-f8cd-46da-af7a-91992b6f39df","Type":"ContainerDied","Data":"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079"} Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.982019 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-xv98q" event={"ID":"790096ba-f8cd-46da-af7a-91992b6f39df","Type":"ContainerDied","Data":"963a7a70e819415eaaccc34348dd83ca14297d66400b312712c685c33a7bce40"} Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.982039 4752 scope.go:117] "RemoveContainer" containerID="2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079" Nov 24 11:25:18 crc kubenswrapper[4752]: I1124 11:25:18.982550 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"] Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.017714 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.019940 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerStarted","Data":"732074ae4e8be72a1a53e99b47e2b6e7e29e754a05bc470943479a5ff3b53dd8"} Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.021842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4ddd687-vk74x" event={"ID":"1cd9d1a6-a562-443e-b16d-76c159107794","Type":"ContainerStarted","Data":"4e5f7910f2ce4f6ad256efb05f29eb26420dbab368e692d14cce58e0dcac8968"} Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.024073 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerStarted","Data":"c497636f28d92eb377f777117aba918117bba1876dda5e891bc31443246d02c2"} Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.027612 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" event={"ID":"037885e0-98e4-4c61-b8dd-652d9697be28","Type":"ContainerStarted","Data":"90f029dec2cc57e8a911ef2cff832684b61182924b2a4081c3429a3b0685959c"} Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.035815 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.035977 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.037055 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-xv98q"] Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.044292 4752 scope.go:117] "RemoveContainer" containerID="b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.113379 4752 scope.go:117] "RemoveContainer" containerID="2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079" Nov 24 11:25:19 crc kubenswrapper[4752]: E1124 11:25:19.114357 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079\": container with ID starting with 2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079 not found: ID does not exist" containerID="2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.114418 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079"} err="failed to get container status \"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079\": rpc error: code = NotFound desc = could not find container \"2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079\": container with ID starting with 2de8f261253e3975a1f45ac501152ce8ec9bcf703e90871bb8dcd7719edcc079 not found: ID does not exist" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.114453 4752 scope.go:117] "RemoveContainer" containerID="b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2" Nov 24 11:25:19 crc kubenswrapper[4752]: E1124 11:25:19.114923 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2\": container with ID starting with b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2 not found: ID does not exist" containerID="b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.114958 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2"} err="failed to get container status \"b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2\": rpc error: code = NotFound desc = could not find container \"b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2\": container with ID starting with b526584b27b044b9e54b10848e673041b160508afce948c9cbdd5f892eeee5d2 not found: ID does not exist" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.116270 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:19 crc kubenswrapper[4752]: I1124 11:25:19.188545 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.038440 4752 generic.go:334] "Generic (PLEG): container finished" podID="037885e0-98e4-4c61-b8dd-652d9697be28" containerID="89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869" exitCode=0 Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.038497 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" event={"ID":"037885e0-98e4-4c61-b8dd-652d9697be28","Type":"ContainerDied","Data":"89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.042272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerStarted","Data":"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.042418 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" 
event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerStarted","Data":"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.042527 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.044301 4752 generic.go:334] "Generic (PLEG): container finished" podID="96ca5590-9544-4b84-b86d-1ec3ef57a829" containerID="28993df0524c42d77ebd5a549957c79d732fa86a3d6de2ac8c0b38ed64532ac2" exitCode=0 Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.044373 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s49hz" event={"ID":"96ca5590-9544-4b84-b86d-1ec3ef57a829","Type":"ContainerDied","Data":"28993df0524c42d77ebd5a549957c79d732fa86a3d6de2ac8c0b38ed64532ac2"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.080600 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4ddd687-vk74x" event={"ID":"1cd9d1a6-a562-443e-b16d-76c159107794","Type":"ContainerStarted","Data":"0a1dcccaeed6faa3a0411c4c11965da3a4a0622d15fdfb5a731e239a841bf153"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.081039 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.106353 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerStarted","Data":"b891d4bc5a7581172d59c7e614c2ede206cf3a7bc2dcfbffabe7f2e2bc23b602"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.107045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerStarted","Data":"31f4463200798c8e71465693c21c3b267736953dd9c6093042d651dea9d77b08"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.107141 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerStarted","Data":"2ab9d714009d285cbf3a02722516d82bfbff37f2e74cc178d47099771bf093e0"} Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.107213 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.108077 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.108167 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.141466 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podStartSLOduration=3.14145069 podStartE2EDuration="3.14145069s" podCreationTimestamp="2025-11-24 11:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:20.10821817 +0000 UTC m=+1126.093038459" watchObservedRunningTime="2025-11-24 11:25:20.14145069 +0000 UTC m=+1126.126270979" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.153108 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-7f4ddd687-vk74x" podStartSLOduration=4.153090107 podStartE2EDuration="4.153090107s" podCreationTimestamp="2025-11-24 11:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:20.128160796 +0000 UTC m=+1126.112981085" watchObservedRunningTime="2025-11-24 11:25:20.153090107 +0000 UTC m=+1126.137910386" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.162279 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56dd6c8857-kp42z" podStartSLOduration=3.162263072 podStartE2EDuration="3.162263072s" podCreationTimestamp="2025-11-24 11:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:20.146587699 +0000 UTC m=+1126.131407988" watchObservedRunningTime="2025-11-24 11:25:20.162263072 +0000 UTC m=+1126.147083361" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.739003 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" path="/var/lib/kubelet/pods/790096ba-f8cd-46da-af7a-91992b6f39df/volumes" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.970014 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"] Nov 24 11:25:20 crc kubenswrapper[4752]: E1124 11:25:20.970786 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="dnsmasq-dns" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.970808 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="dnsmasq-dns" Nov 24 11:25:20 crc kubenswrapper[4752]: E1124 11:25:20.970843 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="init" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.970851 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="init" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.971076 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="790096ba-f8cd-46da-af7a-91992b6f39df" containerName="dnsmasq-dns" Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.980387 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.983033 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.983292 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Nov 24 11:25:20 crc kubenswrapper[4752]: I1124 11:25:20.990493 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"]
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.097941 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wb7\" (UniqueName: \"kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098016 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098114 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098156 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.098183 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.147892 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" event={"ID":"037885e0-98e4-4c61-b8dd-652d9697be28","Type":"ContainerStarted","Data":"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6"}
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.148105 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.148120 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.150027 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77cb55bf5d-j9l7d"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.207231 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" podStartSLOduration=4.2072080849999995 podStartE2EDuration="4.207208085s" podCreationTimestamp="2025-11-24 11:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:21.181246195 +0000 UTC m=+1127.166066484" watchObservedRunningTime="2025-11-24 11:25:21.207208085 +0000 UTC m=+1127.192028394"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.235992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wb7\" (UniqueName: \"kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236530 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236569 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.236635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.242269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.249218 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.254628 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.255006 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.257088 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.270190 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.329714 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wb7\" (UniqueName: \"kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7\") pod \"barbican-api-68f8b8b648-65q5g\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " pod="openstack/barbican-api-68f8b8b648-65q5g"
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache]" Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.595678 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68f8b8b648-65q5g" Nov 24 11:25:21 crc kubenswrapper[4752]: I1124 11:25:21.972134 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.073368 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.185229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s49hz" event={"ID":"96ca5590-9544-4b84-b86d-1ec3ef57a829","Type":"ContainerDied","Data":"8bbe99dd7b498081d1e77dc7d049f2ffff9c5aacb9d2a7a74f8e0ef9c7bab0cb"} Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.185266 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bbe99dd7b498081d1e77dc7d049f2ffff9c5aacb9d2a7a74f8e0ef9c7bab0cb" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.185319 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.185329 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.185894 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.246490 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-s49hz" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.362811 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtwq6\" (UniqueName: \"kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6\") pod \"96ca5590-9544-4b84-b86d-1ec3ef57a829\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.362905 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data\") pod \"96ca5590-9544-4b84-b86d-1ec3ef57a829\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.362957 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts\") pod \"96ca5590-9544-4b84-b86d-1ec3ef57a829\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.362999 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs\") pod \"96ca5590-9544-4b84-b86d-1ec3ef57a829\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.363030 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle\") pod \"96ca5590-9544-4b84-b86d-1ec3ef57a829\" (UID: \"96ca5590-9544-4b84-b86d-1ec3ef57a829\") " Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.364498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs" (OuterVolumeSpecName: "logs") pod "96ca5590-9544-4b84-b86d-1ec3ef57a829" (UID: "96ca5590-9544-4b84-b86d-1ec3ef57a829"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.386887 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts" (OuterVolumeSpecName: "scripts") pod "96ca5590-9544-4b84-b86d-1ec3ef57a829" (UID: "96ca5590-9544-4b84-b86d-1ec3ef57a829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.408074 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6" (OuterVolumeSpecName: "kube-api-access-qtwq6") pod "96ca5590-9544-4b84-b86d-1ec3ef57a829" (UID: "96ca5590-9544-4b84-b86d-1ec3ef57a829"). InnerVolumeSpecName "kube-api-access-qtwq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.447705 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96ca5590-9544-4b84-b86d-1ec3ef57a829" (UID: "96ca5590-9544-4b84-b86d-1ec3ef57a829"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.468212 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtwq6\" (UniqueName: \"kubernetes.io/projected/96ca5590-9544-4b84-b86d-1ec3ef57a829-kube-api-access-qtwq6\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.468466 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.468560 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ca5590-9544-4b84-b86d-1ec3ef57a829-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.468636 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.492916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data" (OuterVolumeSpecName: "config-data") pod "96ca5590-9544-4b84-b86d-1ec3ef57a829" (UID: "96ca5590-9544-4b84-b86d-1ec3ef57a829"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.573421 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ca5590-9544-4b84-b86d-1ec3ef57a829-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.924833 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"] Nov 24 11:25:22 crc kubenswrapper[4752]: W1124 11:25:22.934492 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c592991_2ffb_417a_aa80_49d0111618bb.slice/crio-ce2c6985b321235d1278471374362c9d02efbc790fbe346fd16a502f6a1612f5 WatchSource:0}: Error finding container ce2c6985b321235d1278471374362c9d02efbc790fbe346fd16a502f6a1612f5: Status 404 returned error can't find the container with id ce2c6985b321235d1278471374362c9d02efbc790fbe346fd16a502f6a1612f5 Nov 24 11:25:22 crc kubenswrapper[4752]: I1124 11:25:22.981403 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.080406 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.213719 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerStarted","Data":"b77d7cf0b069a2acc502c8ead2f25a536ecf99b661c7953794e5aa4a197061a2"} Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.213792 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerStarted","Data":"ce2c6985b321235d1278471374362c9d02efbc790fbe346fd16a502f6a1612f5"} Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.218081 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerStarted","Data":"1b7082498c964fc9737d85041d682c2e7e12fce540ffa1aaeb63a62e8dc412bf"}
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.218239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerStarted","Data":"92a697b4785f20d4c2e3ad0c1a378d7b86f11315ec2519ac9828547a6554fe0b"}
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.225643 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerStarted","Data":"30b709fe7204f5ebbe3e77680225e7690d93a9b9881d529fce8b592b7c3e865e"}
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.225710 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerStarted","Data":"b756f1cb33ac9fd77f5462fb04d0efd55526c3b8882a7fb850fc306538938658"}
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.225727 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s49hz"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.243736 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" podStartSLOduration=3.175064218 podStartE2EDuration="7.243718818s" podCreationTimestamp="2025-11-24 11:25:16 +0000 UTC" firstStartedPulling="2025-11-24 11:25:18.109798868 +0000 UTC m=+1124.094619157" lastFinishedPulling="2025-11-24 11:25:22.178453468 +0000 UTC m=+1128.163273757" observedRunningTime="2025-11-24 11:25:23.237256291 +0000 UTC m=+1129.222076580" watchObservedRunningTime="2025-11-24 11:25:23.243718818 +0000 UTC m=+1129.228539107"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.300359 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" podStartSLOduration=3.396040583 podStartE2EDuration="7.300341074s" podCreationTimestamp="2025-11-24 11:25:16 +0000 UTC" firstStartedPulling="2025-11-24 11:25:18.293138035 +0000 UTC m=+1124.277958324" lastFinishedPulling="2025-11-24 11:25:22.197438536 +0000 UTC m=+1128.182258815" observedRunningTime="2025-11-24 11:25:23.251373569 +0000 UTC m=+1129.236193858" watchObservedRunningTime="2025-11-24 11:25:23.300341074 +0000 UTC m=+1129.285161363"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.386154 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"]
Nov 24 11:25:23 crc kubenswrapper[4752]: E1124 11:25:23.386857 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" containerName="placement-db-sync"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.386873 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" containerName="placement-db-sync"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.387554 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" containerName="placement-db-sync"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.388518 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.400906 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47vqz"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.401116 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.401234 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.401340 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.401480 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.403895 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"]
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.502861 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.502947 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.502994 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.503015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.503127 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.503236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmfz\" (UniqueName: \"kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.503268 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604577 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604652 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxmfz\" (UniqueName: \"kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.604814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.605180 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.611960 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.613234 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.615017 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.621526 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.628235 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxmfz\" (UniqueName: \"kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.635383 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle\") pod \"placement-57b7bbd86d-9drzs\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:23 crc kubenswrapper[4752]: I1124 11:25:23.756399 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:24 crc kubenswrapper[4752]: I1124 11:25:24.236202 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerStarted","Data":"58ade1c43e6907e3577011461f577c10b33e8682d39ece87e8fbf57d78ed060c"}
Nov 24 11:25:24 crc kubenswrapper[4752]: I1124 11:25:24.236534 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:24 crc kubenswrapper[4752]: I1124 11:25:24.237994 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff2l8" event={"ID":"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9","Type":"ContainerStarted","Data":"7db16ce7ee3ac97fc182bddca6eb14221eeabd5088bcf45d2229d3661c794580"}
Nov 24 11:25:24 crc kubenswrapper[4752]: I1124 11:25:24.266639 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68f8b8b648-65q5g" podStartSLOduration=4.266615624 podStartE2EDuration="4.266615624s" podCreationTimestamp="2025-11-24 11:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:24.257542142 +0000 UTC m=+1130.242362431" watchObservedRunningTime="2025-11-24 11:25:24.266615624 +0000 UTC m=+1130.251435913"
Nov 24 11:25:24 crc kubenswrapper[4752]: I1124 11:25:24.293990 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ff2l8" podStartSLOduration=5.383106762 podStartE2EDuration="44.293971034s" podCreationTimestamp="2025-11-24 11:24:40 +0000 UTC" firstStartedPulling="2025-11-24 11:24:43.27569415 +0000 UTC m=+1089.260514439" lastFinishedPulling="2025-11-24 11:25:22.186558422 +0000 UTC m=+1128.171378711" observedRunningTime="2025-11-24 11:25:24.293072409 +0000 UTC m=+1130.277892698" watchObservedRunningTime="2025-11-24 11:25:24.293971034 +0000 UTC m=+1130.278791323"
Nov 24 11:25:25 crc kubenswrapper[4752]: I1124 11:25:25.177524 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"]
Nov 24 11:25:25 crc kubenswrapper[4752]: I1124 11:25:25.249397 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:27 crc kubenswrapper[4752]: I1124 11:25:27.654044 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr"
Nov 24 11:25:27 crc kubenswrapper[4752]: I1124 11:25:27.744890 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"]
Nov 24 11:25:27 crc kubenswrapper[4752]: I1124 11:25:27.745737 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="dnsmasq-dns" containerID="cri-o://2081cff28fb25e3157d2d48fa39f73ccfb779669f82a6e60a027ff939e135e2c" gracePeriod=10
Nov 24 11:25:28 crc kubenswrapper[4752]: I1124 11:25:28.289139 4752 generic.go:334] "Generic (PLEG): container finished" podID="8bf654b7-df16-44f1-99a8-7eb680634013" containerID="2081cff28fb25e3157d2d48fa39f73ccfb779669f82a6e60a027ff939e135e2c" exitCode=0
event={"ID":"8bf654b7-df16-44f1-99a8-7eb680634013","Type":"ContainerDied","Data":"2081cff28fb25e3157d2d48fa39f73ccfb779669f82a6e60a027ff939e135e2c"} Nov 24 11:25:29 crc kubenswrapper[4752]: I1124 11:25:29.297304 4752 generic.go:334] "Generic (PLEG): container finished" podID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" containerID="7db16ce7ee3ac97fc182bddca6eb14221eeabd5088bcf45d2229d3661c794580" exitCode=0 Nov 24 11:25:29 crc kubenswrapper[4752]: I1124 11:25:29.297394 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff2l8" event={"ID":"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9","Type":"ContainerDied","Data":"7db16ce7ee3ac97fc182bddca6eb14221eeabd5088bcf45d2229d3661c794580"} Nov 24 11:25:29 crc kubenswrapper[4752]: I1124 11:25:29.309212 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:29 crc kubenswrapper[4752]: I1124 11:25:29.472276 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.308933 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" event={"ID":"8bf654b7-df16-44f1-99a8-7eb680634013","Type":"ContainerDied","Data":"6ee29efca3f221bc0072713257c24428eec1b696d3bb5a1b45901ceecef7a46e"} Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.309209 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ee29efca3f221bc0072713257c24428eec1b696d3bb5a1b45901ceecef7a46e" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.310819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerStarted","Data":"327462de80c54143cc9b42a44939bcc5f3f982657f91658cc2099742d508a41d"} Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.342601 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr" Nov 24 11:25:30 crc kubenswrapper[4752]: E1124 11:25:30.428103 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441371 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441520 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441550 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj54n\" (UniqueName: \"kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441645 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.441667 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb\") pod \"8bf654b7-df16-44f1-99a8-7eb680634013\" (UID: \"8bf654b7-df16-44f1-99a8-7eb680634013\") " Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.448014 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n" (OuterVolumeSpecName: "kube-api-access-lj54n") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "kube-api-access-lj54n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.540453 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config" (OuterVolumeSpecName: "config") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.544819 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj54n\" (UniqueName: \"kubernetes.io/projected/8bf654b7-df16-44f1-99a8-7eb680634013-kube-api-access-lj54n\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.544856 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.545800 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.551921 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.556839 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.573311 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bf654b7-df16-44f1-99a8-7eb680634013" (UID: "8bf654b7-df16-44f1-99a8-7eb680634013"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.646621 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.647450 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.647544 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.647620 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf654b7-df16-44f1-99a8-7eb680634013-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.749306 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.749306 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855644 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47dd8\" (UniqueName: \"kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855730 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855778 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855826 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855899 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.855926 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data\") pod \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\" (UID: \"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9\") "
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.856184 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.856583 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.860328 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts" (OuterVolumeSpecName: "scripts") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.860386 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8" (OuterVolumeSpecName: "kube-api-access-47dd8") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "kube-api-access-47dd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.862838 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.882601 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.906388 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data" (OuterVolumeSpecName: "config-data") pod "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" (UID: "dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.957835 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.957884 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.957901 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47dd8\" (UniqueName: \"kubernetes.io/projected/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-kube-api-access-47dd8\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.957913 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:30 crc kubenswrapper[4752]: I1124 11:25:30.957924 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.320653 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ff2l8"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.320655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ff2l8" event={"ID":"dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9","Type":"ContainerDied","Data":"b4a3bdf00e9d790da0d1fd427c18837eb66d7fe486248c2c030f879f87dd2969"}
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.320727 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4a3bdf00e9d790da0d1fd427c18837eb66d7fe486248c2c030f879f87dd2969"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.323077 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerStarted","Data":"aaadfc25131dfddd66fa261c60845cf30f3a819ab939eec3683de48fc967913c"}
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.323151 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerStarted","Data":"608e26395b4dc1de0bd00669f57c463df6c52c83abe35e40f229dda6794c160f"}
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.323279 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57b7bbd86d-9drzs"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.326199 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pz4zr"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.326268 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerStarted","Data":"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4"}
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.327375 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="ceilometer-notification-agent" containerID="cri-o://bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3" gracePeriod=30
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.327408 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="proxy-httpd" containerID="cri-o://c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4" gracePeriod=30
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.327517 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="sg-core" containerID="cri-o://498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf" gracePeriod=30
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.364912 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57b7bbd86d-9drzs" podStartSLOduration=8.364889584 podStartE2EDuration="8.364889584s" podCreationTimestamp="2025-11-24 11:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:31.349999553 +0000 UTC m=+1137.334819862" watchObservedRunningTime="2025-11-24 11:25:31.364889584 +0000 UTC m=+1137.349709893"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.402160 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"]
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.408336 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pz4zr"]
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.685336 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 24 11:25:31 crc kubenswrapper[4752]: E1124 11:25:31.694567 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="init"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.694609 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="init"
Nov 24 11:25:31 crc kubenswrapper[4752]: E1124 11:25:31.694625 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="dnsmasq-dns"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.694634 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="dnsmasq-dns"
Nov 24 11:25:31 crc kubenswrapper[4752]: E1124 11:25:31.694674 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" containerName="cinder-db-sync"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.694685 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" containerName="cinder-db-sync"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.694958 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" containerName="dnsmasq-dns"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.694982 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" containerName="cinder-db-sync"
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.704197 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.704484 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.704655 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5677w" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.705161 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.713486 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.777883 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.777950 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.777985 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.778031 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.778110 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.778214 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwpj\" (UniqueName: \"kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.804597 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.806429 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.830267 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.880812 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.880882 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.880930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.880957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881026 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881142 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7wn\" (UniqueName: \"kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881200 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwpj\" (UniqueName: \"kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881226 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881265 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881313 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.881358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.889982 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.893593 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.900843 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.902394 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.912709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.921238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwpj\" (UniqueName: \"kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj\") pod \"cinder-scheduler-0\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:31 crc kubenswrapper[4752]: E1124 
Nov 24 11:25:31 crc kubenswrapper[4752]: E1124 11:25:31.944515 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade2fdf2_9bc9_45ee_81cf_25be73764135.slice/crio-conmon-c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade2fdf2_9bc9_45ee_81cf_25be73764135.slice/crio-c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e150ca9_5a87_40ba_bbad_bfae139179ae.slice/crio-e0ba6aa64cba38678c4574d02ef74a8614a787b76bf66dad910dba664253978c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa308cc4_8cc5_4a63_926a_033a151f7291.slice/crio-a22dd843e547ae24880f732987ed3607616100f00fdd6d0b7c0a832ecf68d38d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade2fdf2_9bc9_45ee_81cf_25be73764135.slice/crio-conmon-498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ae1a5e_39a4_40d2_8a7f_dd5e7dac6c0d.slice/crio-525620a12830d24c1a64bc6025c7ca13e4b2dc072e4ef9c51dc3990f97b641f2\": RecentStats: unable to find data in memory cache]"
Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.978111 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.985372 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.985859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.985944 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7wn\" (UniqueName: \"kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.986004 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.986041 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.986098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.986150 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.987033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.987315 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.988202 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.988262 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.988685 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:31 crc kubenswrapper[4752]: I1124 11:25:31.994193 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.010061 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7wn\" (UniqueName: \"kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn\") pod \"dnsmasq-dns-6578955fd5-q2svj\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.050127 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092771 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5cg\" (UniqueName: \"kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092866 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-scripts\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092887 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092992 4752 
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.092992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.093015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.148511 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q2svj"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195275 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195676 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195795 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195897 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.195970 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5cg\" (UniqueName: \"kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.196119 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0"
\"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.197055 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.200322 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.201430 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.212917 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-scripts\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.213186 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.226402 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5cg\" (UniqueName: \"kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg\") pod \"cinder-api-0\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.352311 4752 generic.go:334] "Generic (PLEG): container finished" podID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerID="c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4" exitCode=0 Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.352343 4752 generic.go:334] "Generic (PLEG): container finished" podID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerID="498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf" exitCode=2 Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.353180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerDied","Data":"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4"} Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.353205 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerDied","Data":"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf"} Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.353222 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57b7bbd86d-9drzs" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.360808 4752 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.705794 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.752464 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf654b7-df16-44f1-99a8-7eb680634013" path="/var/lib/kubelet/pods/8bf654b7-df16-44f1-99a8-7eb680634013/volumes" Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.803670 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:25:32 crc kubenswrapper[4752]: I1124 11:25:32.993566 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.024906 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025210 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ps2r\" (UniqueName: \"kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025353 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025450 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025611 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025688 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025825 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd\") pod \"ade2fdf2-9bc9-45ee-81cf-25be73764135\" (UID: \"ade2fdf2-9bc9-45ee-81cf-25be73764135\") " Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.025376 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.026299 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.031303 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r" (OuterVolumeSpecName: "kube-api-access-5ps2r") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "kube-api-access-5ps2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.031451 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts" (OuterVolumeSpecName: "scripts") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.084649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.098061 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.113731 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130755 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130793 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130810 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130821 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130948 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ade2fdf2-9bc9-45ee-81cf-25be73764135-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.130961 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ps2r\" (UniqueName: \"kubernetes.io/projected/ade2fdf2-9bc9-45ee-81cf-25be73764135-kube-api-access-5ps2r\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.131023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data" (OuterVolumeSpecName: "config-data") pod "ade2fdf2-9bc9-45ee-81cf-25be73764135" (UID: "ade2fdf2-9bc9-45ee-81cf-25be73764135"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:33 crc kubenswrapper[4752]: W1124 11:25:33.170190 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f12684d_5b53_4655_b64b_45ecb2ae5cf1.slice/crio-f1ad69545b771c5a6e8c5d965ca41d714b3fc899cce5b71ef1855b24d9f309d4 WatchSource:0}: Error finding container f1ad69545b771c5a6e8c5d965ca41d714b3fc899cce5b71ef1855b24d9f309d4: Status 404 returned error can't find the container with id f1ad69545b771c5a6e8c5d965ca41d714b3fc899cce5b71ef1855b24d9f309d4 Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.233337 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade2fdf2-9bc9-45ee-81cf-25be73764135-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.377032 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerStarted","Data":"480a197775c0942bf52c52c5cdf6302a9b30381af1e2a21af6930276ccfb8a11"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.381010 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerID="c0a1ec892d670ab63c53b4b9aa81154db6743fafe39f3112eea17cfd2ef10528" exitCode=0 Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.381232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" event={"ID":"3e1d72bc-5f15-4bf5-a424-62f34eafd84e","Type":"ContainerDied","Data":"c0a1ec892d670ab63c53b4b9aa81154db6743fafe39f3112eea17cfd2ef10528"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.381280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" event={"ID":"3e1d72bc-5f15-4bf5-a424-62f34eafd84e","Type":"ContainerStarted","Data":"f1b43646d153d845681e012ce48db1e7b652caac948d8f1081c47b3f5d67cc4c"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.390843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerStarted","Data":"f1ad69545b771c5a6e8c5d965ca41d714b3fc899cce5b71ef1855b24d9f309d4"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.399737 4752 generic.go:334] "Generic (PLEG): container finished" podID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerID="bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3" exitCode=0 Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.400614 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.400936 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerDied","Data":"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.401027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ade2fdf2-9bc9-45ee-81cf-25be73764135","Type":"ContainerDied","Data":"ee1787e9ba7bd24e18ff2cc096607abe7b971d4d9bae956410d7615f3d5959b7"} Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.401051 4752 scope.go:117] "RemoveContainer" containerID="c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.564870 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.579022 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.596728 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.597254 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="proxy-httpd" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597279 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="proxy-httpd" Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.597307 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="sg-core" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597315 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="sg-core" Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.597330 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="ceilometer-notification-agent" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597339 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="ceilometer-notification-agent" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597609 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="sg-core" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597634 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="proxy-httpd" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.597646 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" containerName="ceilometer-notification-agent" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.599603 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.601949 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.606483 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.610147 4752 scope.go:117] "RemoveContainer" containerID="498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.625066 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.652938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.652999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.653058 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.653090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.653115 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.653152 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.653177 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6fq\" (UniqueName: \"kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.667886 4752 scope.go:117] "RemoveContainer" containerID="bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756434 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756519 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6fq\" (UniqueName: \"kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756599 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756667 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756807 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.756838 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.758143 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.762331 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.774023 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0"
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.782018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.782373 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.790574 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6fq\" (UniqueName: \"kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq\") pod \"ceilometer-0\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") " pod="openstack/ceilometer-0" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.856925 4752 scope.go:117] "RemoveContainer" containerID="c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4" Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.857875 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4\": container with ID starting with c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4 not found: ID does not exist" containerID="c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.857962 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4"} err="failed to get container status \"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4\": rpc error: code = NotFound desc = could not find container \"c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4\": container with ID starting with c0b402145f88ec10addac0b2ea25d8fd53ca5ce1bf1151c73b5b5046a11677e4 not found: ID does not exist" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.858014 4752 scope.go:117] "RemoveContainer" containerID="498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf" Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.858587 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf\": container with ID starting with 498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf not found: ID does not exist" containerID="498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf" Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.858618 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf"} err="failed to get container status \"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf\": rpc error: code = NotFound desc 
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.858618 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf"} err="failed to get container status \"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf\": rpc error: code = NotFound desc = could not find container \"498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf\": container with ID starting with 498fb2360c218ad0383c97e165942009a3fa76dcc477214cca208deb136f6bdf not found: ID does not exist"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.858641 4752 scope.go:117] "RemoveContainer" containerID="bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3"
Nov 24 11:25:33 crc kubenswrapper[4752]: E1124 11:25:33.866713 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3\": container with ID starting with bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3 not found: ID does not exist" containerID="bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.866980 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3"} err="failed to get container status \"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3\": rpc error: code = NotFound desc = could not find container \"bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3\": container with ID starting with bd98460d4d8cd5d5c0e2dbc7e61bc74a65033d2df617a5b4cc89773afc5f69b3 not found: ID does not exist"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.919181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:33 crc kubenswrapper[4752]: I1124 11:25:33.931201 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.148435 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68f8b8b648-65q5g"
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.226043 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"]
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.226334 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api-log" containerID="cri-o://1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95" gracePeriod=30
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.228073 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" containerID="cri-o://0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87" gracePeriod=30
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.235255 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF"
Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.448613 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
event={"ID":"3e1d72bc-5f15-4bf5-a424-62f34eafd84e","Type":"ContainerStarted","Data":"bac1392813903ed8becd43cc8d47a7e24faaeb6360ec55c28bab36488f31e191"} Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.513724 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.552949 4752 generic.go:334] "Generic (PLEG): container finished" podID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerID="1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95" exitCode=143 Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.553027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerDied","Data":"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95"} Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.587899 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerStarted","Data":"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6"} Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.625222 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" podStartSLOduration=3.625200828 podStartE2EDuration="3.625200828s" podCreationTimestamp="2025-11-24 11:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:34.596485319 +0000 UTC m=+1140.581305628" watchObservedRunningTime="2025-11-24 11:25:34.625200828 +0000 UTC m=+1140.610021117" Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.749985 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade2fdf2-9bc9-45ee-81cf-25be73764135" path="/var/lib/kubelet/pods/ade2fdf2-9bc9-45ee-81cf-25be73764135/volumes" Nov 24 11:25:34 crc kubenswrapper[4752]: I1124 11:25:34.751167 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:25:34 crc kubenswrapper[4752]: W1124 11:25:34.783066 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e27be68_eb58_40ff_8cae_686399b726a5.slice/crio-67afd53a4f784f0add9e4745231637b977af697ce3962edad1952cd91f57f73c WatchSource:0}: Error finding container 67afd53a4f784f0add9e4745231637b977af697ce3962edad1952cd91f57f73c: Status 404 returned error can't find the container with id 67afd53a4f784f0add9e4745231637b977af697ce3962edad1952cd91f57f73c Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.611038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerStarted","Data":"86db44d46e0e1fa3d0c22549d2bf06886013ac00affef218eb39fd11985bc94e"} Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.611318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerStarted","Data":"67afd53a4f784f0add9e4745231637b977af697ce3962edad1952cd91f57f73c"} Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.621476 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerStarted","Data":"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f"} Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.621602 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api-log" containerID="cri-o://4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" gracePeriod=30 Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.621715 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api" containerID="cri-o://1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" gracePeriod=30 Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.621956 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.636495 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerStarted","Data":"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220"} Nov 24 11:25:35 crc kubenswrapper[4752]: I1124 11:25:35.646500 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.646476098 podStartE2EDuration="4.646476098s" podCreationTimestamp="2025-11-24 11:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:35.641776812 +0000 UTC m=+1141.626597101" watchObservedRunningTime="2025-11-24 11:25:35.646476098 +0000 UTC m=+1141.631296387" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.573488 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.649190 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerStarted","Data":"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad"} Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.653027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerStarted","Data":"4afdae42fda582e2e9cc1f06a212428e40a4399ac84443a679b5537eca27548b"} Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.654976 4752 generic.go:334] "Generic (PLEG): container finished" podID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerID="1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" exitCode=0 Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.654999 4752 generic.go:334] "Generic (PLEG): container finished" podID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerID="4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" exitCode=143 Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.655015 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerDied","Data":"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f"} Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.655030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerDied","Data":"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6"} Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.655040 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f12684d-5b53-4655-b64b-45ecb2ae5cf1","Type":"ContainerDied","Data":"f1ad69545b771c5a6e8c5d965ca41d714b3fc899cce5b71ef1855b24d9f309d4"} Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.655054 4752 scope.go:117] "RemoveContainer" containerID="1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.655074 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.682018 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.728425886 podStartE2EDuration="5.681998038s" podCreationTimestamp="2025-11-24 11:25:31 +0000 UTC" firstStartedPulling="2025-11-24 11:25:32.716358634 +0000 UTC m=+1138.701178923" lastFinishedPulling="2025-11-24 11:25:33.669930786 +0000 UTC m=+1139.654751075" observedRunningTime="2025-11-24 11:25:36.679857266 +0000 UTC m=+1142.664677575" watchObservedRunningTime="2025-11-24 11:25:36.681998038 +0000 UTC m=+1142.666818327" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.694777 4752 scope.go:117] "RemoveContainer" containerID="4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.722916 4752 scope.go:117] "RemoveContainer" containerID="1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" Nov 24 11:25:36 crc kubenswrapper[4752]: E1124 11:25:36.723292 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f\": container with ID starting with 1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f not found: ID does not exist" containerID="1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.723322 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f"} err="failed to get container status \"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f\": rpc error: code = NotFound desc = could not find container \"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f\": container with ID starting with 1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f not found: ID does not exist" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.723342 4752 scope.go:117] "RemoveContainer" containerID="4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" Nov 24 11:25:36 crc kubenswrapper[4752]: E1124 11:25:36.723701 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6\": container with ID starting with 4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6 not found: ID does not exist" containerID="4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.723726 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6"} err="failed to get container status \"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6\": rpc error: code = NotFound desc = could not find container \"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6\": container with ID starting with 4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6 not found: ID does not exist" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.723773 4752 scope.go:117] "RemoveContainer" containerID="1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 
11:25:36.724215 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f"} err="failed to get container status \"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f\": rpc error: code = NotFound desc = could not find container \"1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f\": container with ID starting with 1e37174cb5776bcc9bf2ae316d4aabad5f8fbcba2f81c0a8a967ce97a7f4d54f not found: ID does not exist" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.724258 4752 scope.go:117] "RemoveContainer" containerID="4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.724572 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6"} err="failed to get container status \"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6\": rpc error: code = NotFound desc = could not find container \"4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6\": container with ID starting with 4acd9eefec8f3ea7ac6032304d1c98bb9137b693ed7f35b765f14329e99f34d6 not found: ID does not exist" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725020 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725070 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725235 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725274 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5cg\" (UniqueName: \"kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725301 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-scripts\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725351 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle\") pod \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\" (UID: \"6f12684d-5b53-4655-b64b-45ecb2ae5cf1\") " Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.725895 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.726161 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs" (OuterVolumeSpecName: "logs") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.732578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg" (OuterVolumeSpecName: "kube-api-access-8w5cg") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "kube-api-access-8w5cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.733099 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-scripts" (OuterVolumeSpecName: "scripts") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.755831 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.764957 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.826926 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data" (OuterVolumeSpecName: "config-data") pod "6f12684d-5b53-4655-b64b-45ecb2ae5cf1" (UID: "6f12684d-5b53-4655-b64b-45ecb2ae5cf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828450 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828504 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828525 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828541 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828559 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5cg\" (UniqueName: \"kubernetes.io/projected/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-kube-api-access-8w5cg\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828580 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:36 crc kubenswrapper[4752]: I1124 11:25:36.828596 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f12684d-5b53-4655-b64b-45ecb2ae5cf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.012195 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.021846 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.050181 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:37 crc kubenswrapper[4752]: E1124 11:25:37.050809 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api-log" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.050827 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api-log" Nov 24 11:25:37 crc kubenswrapper[4752]: E1124 11:25:37.051042 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.051054 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.051220 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api-log" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.051252 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" containerName="cinder-api" Nov 24 11:25:37 crc 
kubenswrapper[4752]: I1124 11:25:37.052491 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.053860 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.058408 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.058659 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.058943 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.080222 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144097 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144150 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144234 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144255 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144271 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhfj\" (UniqueName: \"kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144310 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144332 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144356 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.144414 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246036 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246578 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246664 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246685 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhfj\" (UniqueName: \"kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246733 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.246807 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.247726 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.252575 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.263480 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.263585 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.263665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.264418 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.278167 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data\") pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.286711 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhfj\" (UniqueName: \"kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj\") 
pod \"cinder-api-0\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.399636 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.734660 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerStarted","Data":"78678ced3942ffc63c968c7e93809d754c643c147367ead303121fdfa7d27a16"} Nov 24 11:25:37 crc kubenswrapper[4752]: I1124 11:25:37.949598 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:25:37 crc kubenswrapper[4752]: W1124 11:25:37.960141 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd3886c_7122_4bd5_97d2_5c448c31e941.slice/crio-e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3 WatchSource:0}: Error finding container e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3: Status 404 returned error can't find the container with id e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3 Nov 24 11:25:38 crc kubenswrapper[4752]: I1124 11:25:38.725323 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:57978->10.217.0.155:9311: read: connection reset by peer" Nov 24 11:25:38 crc kubenswrapper[4752]: I1124 11:25:38.726200 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77cb55bf5d-j9l7d" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:57980->10.217.0.155:9311: read: connection reset by peer" Nov 24 11:25:38 crc kubenswrapper[4752]: I1124 11:25:38.740655 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f12684d-5b53-4655-b64b-45ecb2ae5cf1" path="/var/lib/kubelet/pods/6f12684d-5b53-4655-b64b-45ecb2ae5cf1/volumes" Nov 24 11:25:38 crc kubenswrapper[4752]: I1124 11:25:38.752164 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerStarted","Data":"8afbde4838c68ed65d0a791d3f6cf81c9dff82063ee60fdee90cec80f669ae26"} Nov 24 11:25:38 crc kubenswrapper[4752]: I1124 11:25:38.752210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerStarted","Data":"e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3"} Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.129443 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.194684 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data\") pod \"3ccd1d72-1c78-439b-b3ba-d38159757b03\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.194810 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle\") pod \"3ccd1d72-1c78-439b-b3ba-d38159757b03\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.194841 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhcn4\" (UniqueName: \"kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4\") pod \"3ccd1d72-1c78-439b-b3ba-d38159757b03\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.194869 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom\") pod \"3ccd1d72-1c78-439b-b3ba-d38159757b03\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.194918 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs\") pod \"3ccd1d72-1c78-439b-b3ba-d38159757b03\" (UID: \"3ccd1d72-1c78-439b-b3ba-d38159757b03\") " Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.196031 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs" (OuterVolumeSpecName: "logs") pod "3ccd1d72-1c78-439b-b3ba-d38159757b03" (UID: "3ccd1d72-1c78-439b-b3ba-d38159757b03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.198841 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ccd1d72-1c78-439b-b3ba-d38159757b03" (UID: "3ccd1d72-1c78-439b-b3ba-d38159757b03"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.198973 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4" (OuterVolumeSpecName: "kube-api-access-xhcn4") pod "3ccd1d72-1c78-439b-b3ba-d38159757b03" (UID: "3ccd1d72-1c78-439b-b3ba-d38159757b03"). InnerVolumeSpecName "kube-api-access-xhcn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.220608 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ccd1d72-1c78-439b-b3ba-d38159757b03" (UID: "3ccd1d72-1c78-439b-b3ba-d38159757b03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.244429 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data" (OuterVolumeSpecName: "config-data") pod "3ccd1d72-1c78-439b-b3ba-d38159757b03" (UID: "3ccd1d72-1c78-439b-b3ba-d38159757b03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.296932 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.296977 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.296995 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhcn4\" (UniqueName: \"kubernetes.io/projected/3ccd1d72-1c78-439b-b3ba-d38159757b03-kube-api-access-xhcn4\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.297007 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccd1d72-1c78-439b-b3ba-d38159757b03-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.297020 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ccd1d72-1c78-439b-b3ba-d38159757b03-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.762481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerStarted","Data":"11876dbf5b00d15eb3d002e8d11562f90c15b2bebf4c5f309e67dc8dc3e37bbb"} Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.762926 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.765324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerStarted","Data":"5b4736909b3a1a5ca49bd05264baefafda1155d24a297c3da3c06add676e920d"} Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.765453 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.767510 4752 generic.go:334] "Generic (PLEG): container finished" podID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerID="0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87" exitCode=0 Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.767537 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77cb55bf5d-j9l7d" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.767548 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerDied","Data":"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87"} Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.767571 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77cb55bf5d-j9l7d" event={"ID":"3ccd1d72-1c78-439b-b3ba-d38159757b03","Type":"ContainerDied","Data":"0767e4392a4f29bb6c6db310302c1e32144c1a0148514dd43a2006f11eb94daa"} Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.767609 4752 scope.go:117] "RemoveContainer" containerID="0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.793906 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.793880454 podStartE2EDuration="2.793880454s" podCreationTimestamp="2025-11-24 11:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:39.781881207 +0000 UTC m=+1145.766701506" watchObservedRunningTime="2025-11-24 11:25:39.793880454 +0000 UTC m=+1145.778700753" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.803227 4752 scope.go:117] "RemoveContainer" containerID="1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.809326 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"] Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.830508 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77cb55bf5d-j9l7d"] Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.849325 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.853380015 podStartE2EDuration="6.849297555s" podCreationTimestamp="2025-11-24 11:25:33 +0000 UTC" firstStartedPulling="2025-11-24 11:25:34.82842251 +0000 UTC m=+1140.813242799" lastFinishedPulling="2025-11-24 11:25:38.82434005 +0000 UTC m=+1144.809160339" observedRunningTime="2025-11-24 11:25:39.841372966 +0000 UTC m=+1145.826193275" watchObservedRunningTime="2025-11-24 11:25:39.849297555 +0000 UTC m=+1145.834117854" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.856798 4752 scope.go:117] "RemoveContainer" containerID="0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87" Nov 24 11:25:39 crc kubenswrapper[4752]: E1124 11:25:39.857291 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87\": container with ID starting with 0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87 not found: ID does not exist" containerID="0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.857325 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87"} err="failed to get container status \"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87\": rpc error: code = NotFound 
desc = could not find container \"0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87\": container with ID starting with 0944d76f410a93967812d134b9813035d6cbceb79e73c46ce318e5b60349cf87 not found: ID does not exist" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.857351 4752 scope.go:117] "RemoveContainer" containerID="1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95" Nov 24 11:25:39 crc kubenswrapper[4752]: E1124 11:25:39.857596 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95\": container with ID starting with 1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95 not found: ID does not exist" containerID="1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95" Nov 24 11:25:39 crc kubenswrapper[4752]: I1124 11:25:39.857620 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95"} err="failed to get container status \"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95\": rpc error: code = NotFound desc = could not find container \"1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95\": container with ID starting with 1251c4aae8a25020abf87749980547519404f0a5223e44e5247b6f79a26cfc95 not found: ID does not exist" Nov 24 11:25:40 crc kubenswrapper[4752]: I1124 11:25:40.740815 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" path="/var/lib/kubelet/pods/3ccd1d72-1c78-439b-b3ba-d38159757b03/volumes" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.149914 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.230819 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.232509 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="dnsmasq-dns" containerID="cri-o://4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6" gracePeriod=10 Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.335584 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.406265 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.759326 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.807522 4752 generic.go:334] "Generic (PLEG): container finished" podID="037885e0-98e4-4c61-b8dd-652d9697be28" containerID="4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6" exitCode=0 Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.807585 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.807937 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="cinder-scheduler" containerID="cri-o://f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220" gracePeriod=30 Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.808100 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" event={"ID":"037885e0-98e4-4c61-b8dd-652d9697be28","Type":"ContainerDied","Data":"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6"} Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.808148 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" event={"ID":"037885e0-98e4-4c61-b8dd-652d9697be28","Type":"ContainerDied","Data":"90f029dec2cc57e8a911ef2cff832684b61182924b2a4081c3429a3b0685959c"} Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.808174 4752 scope.go:117] "RemoveContainer" containerID="4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.808334 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="probe" containerID="cri-o://7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad" gracePeriod=30 Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.847107 4752 scope.go:117] "RemoveContainer" containerID="89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.874963 4752 scope.go:117] "RemoveContainer" containerID="4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6" Nov 24 11:25:42 crc kubenswrapper[4752]: E1124 11:25:42.875686 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6\": container with ID starting with 4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6 not found: ID does not exist" containerID="4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.875925 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6"} err="failed to get container status \"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6\": rpc error: code = NotFound desc = could not find container \"4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6\": container with ID starting with 4394190ea47e8968767d41d26ef1ea71de11fd42f1d75708f3180991725e13c6 not found: ID does not exist" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.875975 4752 scope.go:117] "RemoveContainer" containerID="89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869" Nov 24 11:25:42 crc kubenswrapper[4752]: E1124 11:25:42.876488 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869\": container with ID starting with 89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869 not found: ID does not exist" 
containerID="89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.876521 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869"} err="failed to get container status \"89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869\": rpc error: code = NotFound desc = could not find container \"89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869\": container with ID starting with 89d418148239f12b3d76d424e0dc7aacfdc84f38d5a2d0e0b99b864bd0c84869 not found: ID does not exist" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.893788 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.893887 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.893936 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.893971 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.894075 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mflck\" (UniqueName: \"kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.894128 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb\") pod \"037885e0-98e4-4c61-b8dd-652d9697be28\" (UID: \"037885e0-98e4-4c61-b8dd-652d9697be28\") " Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.901032 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck" (OuterVolumeSpecName: "kube-api-access-mflck") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "kube-api-access-mflck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.939997 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.941212 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config" (OuterVolumeSpecName: "config") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.942647 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.947700 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:42 crc kubenswrapper[4752]: I1124 11:25:42.952566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "037885e0-98e4-4c61-b8dd-652d9697be28" (UID: "037885e0-98e4-4c61-b8dd-652d9697be28"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996592 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996627 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996638 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996649 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996660 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mflck\" (UniqueName: \"kubernetes.io/projected/037885e0-98e4-4c61-b8dd-652d9697be28-kube-api-access-mflck\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:42.996668 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/037885e0-98e4-4c61-b8dd-652d9697be28-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:43.138238 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:43.145280 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-pdfvr"] Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:43.818236 4752 generic.go:334] "Generic (PLEG): container finished" podID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerID="7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad" exitCode=0 Nov 24 11:25:43 crc kubenswrapper[4752]: I1124 11:25:43.818338 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerDied","Data":"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad"} Nov 24 11:25:44 crc kubenswrapper[4752]: I1124 11:25:44.321474 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76c46557f6-h7874" Nov 24 11:25:44 crc kubenswrapper[4752]: I1124 11:25:44.741265 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" path="/var/lib/kubelet/pods/037885e0-98e4-4c61-b8dd-652d9697be28/volumes" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.421436 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569662 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569791 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569871 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569916 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569940 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwpj\" (UniqueName: \"kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.569997 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data\") pod \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\" (UID: \"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89\") " Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.570255 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.575667 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj" (OuterVolumeSpecName: "kube-api-access-5dwpj") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "kube-api-access-5dwpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.576430 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.576995 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts" (OuterVolumeSpecName: "scripts") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.639894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.672485 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.672526 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwpj\" (UniqueName: \"kubernetes.io/projected/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-kube-api-access-5dwpj\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.672540 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.672549 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.672561 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.692436 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data" (OuterVolumeSpecName: "config-data") pod "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" (UID: "9d7aff9e-10fe-48a7-a77b-049a0b1bbf89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.774613 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.845789 4752 generic.go:334] "Generic (PLEG): container finished" podID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerID="f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220" exitCode=0 Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.845893 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.845883 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerDied","Data":"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220"} Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.846287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d7aff9e-10fe-48a7-a77b-049a0b1bbf89","Type":"ContainerDied","Data":"480a197775c0942bf52c52c5cdf6302a9b30381af1e2a21af6930276ccfb8a11"} Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.846314 4752 scope.go:117] "RemoveContainer" containerID="7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.870958 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.874883 4752 scope.go:117] "RemoveContainer" containerID="f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.882643 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.897917 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898377 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="probe" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898399 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="probe" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898420 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="dnsmasq-dns" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898428 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="dnsmasq-dns" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898441 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="cinder-scheduler" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898449 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="cinder-scheduler" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898482 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="init" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898489 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="init" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898505 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api-log" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898513 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api-log" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.898522 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898529 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898726 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api-log" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898757 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="probe" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898773 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="dnsmasq-dns" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898787 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" containerName="cinder-scheduler" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.898807 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ccd1d72-1c78-439b-b3ba-d38159757b03" containerName="barbican-api" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.900024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.903323 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.905707 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.905853 4752 scope.go:117] "RemoveContainer" containerID="7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.906485 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad\": container with ID starting with 7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad not found: ID does not exist" containerID="7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.906522 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad"} err="failed to get container status \"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad\": rpc error: code = NotFound desc = could not find container \"7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad\": container with ID starting with 7a0303ebe71bd53987be3a1043fac15257dbe0312bba14b10d7f2e13779db2ad not found: ID does not exist" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.906552 4752 scope.go:117] "RemoveContainer" containerID="f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220" Nov 24 11:25:46 crc kubenswrapper[4752]: E1124 11:25:46.906940 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220\": container with ID starting with f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220 not found: ID 
does not exist" containerID="f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220" Nov 24 11:25:46 crc kubenswrapper[4752]: I1124 11:25:46.906989 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220"} err="failed to get container status \"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220\": rpc error: code = NotFound desc = could not find container \"f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220\": container with ID starting with f3ac6206d54b4af8879ca35629b03c6e8f2fb6ab610ec2911130781ab09b3220 not found: ID does not exist" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klw4x\" (UniqueName: \"kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079233 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079394 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079462 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.079584 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181218 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181301 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181326 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181362 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181384 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.181430 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klw4x\" (UniqueName: \"kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.182209 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.188321 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.188408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.188769 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.189074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.207707 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klw4x\" (UniqueName: \"kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x\") pod \"cinder-scheduler-0\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.214668 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:25:47 crc kubenswrapper[4752]: W1124 11:25:47.654251 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0084f6_064e_4089_87fe_4ead63923b56.slice/crio-cf1d42fd20548e21c71af29102a5bd9e07f9567f10fb372db9995b22fff3e318 WatchSource:0}: Error finding container cf1d42fd20548e21c71af29102a5bd9e07f9567f10fb372db9995b22fff3e318: Status 404 returned error can't find the container with id cf1d42fd20548e21c71af29102a5bd9e07f9567f10fb372db9995b22fff3e318 Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.655587 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-pdfvr" podUID="037885e0-98e4-4c61-b8dd-652d9697be28" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: i/o timeout" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.666246 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.759274 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56dd6c8857-kp42z" Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.829268 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76c46557f6-h7874"] Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.830461 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76c46557f6-h7874" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-httpd" containerID="cri-o://11fb1748bacf7cdd4881adda1adafd0b56fde11f31f6e1af46f23c8badc5f64a" gracePeriod=30 Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.830390 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76c46557f6-h7874" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-api" containerID="cri-o://2083f407f786cbdc50075533918be421651e6ec55fbad9e1b8c579416e048543" gracePeriod=30 Nov 24 11:25:47 crc kubenswrapper[4752]: I1124 11:25:47.861702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerStarted","Data":"cf1d42fd20548e21c71af29102a5bd9e07f9567f10fb372db9995b22fff3e318"} Nov 24 11:25:48 crc kubenswrapper[4752]: I1124 11:25:48.739614 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7aff9e-10fe-48a7-a77b-049a0b1bbf89" path="/var/lib/kubelet/pods/9d7aff9e-10fe-48a7-a77b-049a0b1bbf89/volumes" Nov 24 11:25:48 crc kubenswrapper[4752]: I1124 11:25:48.875515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerStarted","Data":"9e7930d8efc895b38537d979ec9125f2a956817fea154183ec332abf612d2f8c"} Nov 24 11:25:48 crc kubenswrapper[4752]: I1124 11:25:48.877077 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerID="11fb1748bacf7cdd4881adda1adafd0b56fde11f31f6e1af46f23c8badc5f64a" exitCode=0 Nov 24 11:25:48 crc kubenswrapper[4752]: I1124 11:25:48.877104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerDied","Data":"11fb1748bacf7cdd4881adda1adafd0b56fde11f31f6e1af46f23c8badc5f64a"} Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.149104 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.894644 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerStarted","Data":"be46845395e6568e301fffb75b2b8cbf950babfb7216ca6d7fcccb52601c5794"} Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.913558 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.913535164 podStartE2EDuration="3.913535164s" podCreationTimestamp="2025-11-24 11:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:25:49.909319243 +0000 UTC m=+1155.894139542" watchObservedRunningTime="2025-11-24 11:25:49.913535164 +0000 UTC m=+1155.898355453" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.934594 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.935780 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.937470 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lb9wb" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.937562 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.938811 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.943560 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:49 crc kubenswrapper[4752]: I1124 11:25:49.974468 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.033527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.033814 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl8bk\" (UniqueName: \"kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.033925 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.034160 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.135811 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl8bk\" (UniqueName: \"kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.135868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.135987 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.136736 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.137044 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.141701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.154254 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.170096 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl8bk\" (UniqueName: \"kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk\") pod \"openstackclient\" 
(UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.252659 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.275166 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.287868 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.341195 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.342892 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.360621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:50 crc kubenswrapper[4752]: E1124 11:25:50.408503 4752 log.go:32] "RunPodSandbox from runtime service failed" err=< Nov 24 11:25:50 crc kubenswrapper[4752]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_aec21c99-26e2-45f2-9d50-224cbec04721_0(867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8" Netns:"/var/run/netns/0ac2d73d-2334-4bd8-957b-7e4f416dcf37" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8;K8S_POD_UID=aec21c99-26e2-45f2-9d50-224cbec04721" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/aec21c99-26e2-45f2-9d50-224cbec04721]: expected pod UID "aec21c99-26e2-45f2-9d50-224cbec04721" but got "09540fa5-6ff8-45cc-98be-968283dc2bfd" from Kube API Nov 24 11:25:50 crc kubenswrapper[4752]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Nov 24 11:25:50 crc kubenswrapper[4752]: > Nov 24 11:25:50 crc kubenswrapper[4752]: E1124 11:25:50.408562 4752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Nov 24 11:25:50 crc kubenswrapper[4752]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_aec21c99-26e2-45f2-9d50-224cbec04721_0(867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8" Netns:"/var/run/netns/0ac2d73d-2334-4bd8-957b-7e4f416dcf37" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=867f19db1cc22e95c1a51880bef4f5b26d5567bb6ac89baa1016dc26cfcc72c8;K8S_POD_UID=aec21c99-26e2-45f2-9d50-224cbec04721" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/aec21c99-26e2-45f2-9d50-224cbec04721]: expected pod UID "aec21c99-26e2-45f2-9d50-224cbec04721" but got "09540fa5-6ff8-45cc-98be-968283dc2bfd" from Kube API Nov 24 11:25:50 crc kubenswrapper[4752]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Nov 24 11:25:50 crc kubenswrapper[4752]: > pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.443799 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.443859 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.444325 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.444593 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvtlv\" (UniqueName: \"kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.546876 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.546931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.546974 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.547024 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvtlv\" (UniqueName: \"kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.548522 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.553532 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.553984 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.565087 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvtlv\" (UniqueName: \"kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv\") pod \"openstackclient\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") " pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.670403 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.925089 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.955990 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.962535 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec21c99-26e2-45f2-9d50-224cbec04721" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" Nov 24 11:25:50 crc kubenswrapper[4752]: I1124 11:25:50.998160 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.056836 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle\") pod \"aec21c99-26e2-45f2-9d50-224cbec04721\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.057236 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config\") pod \"aec21c99-26e2-45f2-9d50-224cbec04721\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.057290 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret\") pod \"aec21c99-26e2-45f2-9d50-224cbec04721\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.057451 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl8bk\" (UniqueName: \"kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk\") pod \"aec21c99-26e2-45f2-9d50-224cbec04721\" (UID: \"aec21c99-26e2-45f2-9d50-224cbec04721\") " Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.057855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aec21c99-26e2-45f2-9d50-224cbec04721" (UID: "aec21c99-26e2-45f2-9d50-224cbec04721"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.058343 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.061966 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk" (OuterVolumeSpecName: "kube-api-access-kl8bk") pod "aec21c99-26e2-45f2-9d50-224cbec04721" (UID: "aec21c99-26e2-45f2-9d50-224cbec04721"). InnerVolumeSpecName "kube-api-access-kl8bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.062059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aec21c99-26e2-45f2-9d50-224cbec04721" (UID: "aec21c99-26e2-45f2-9d50-224cbec04721"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.063549 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aec21c99-26e2-45f2-9d50-224cbec04721" (UID: "aec21c99-26e2-45f2-9d50-224cbec04721"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.160629 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl8bk\" (UniqueName: \"kubernetes.io/projected/aec21c99-26e2-45f2-9d50-224cbec04721-kube-api-access-kl8bk\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.160660 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.160726 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec21c99-26e2-45f2-9d50-224cbec04721-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.940176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09540fa5-6ff8-45cc-98be-968283dc2bfd","Type":"ContainerStarted","Data":"ccafef610cbab7d5fbf4aaf9167b875f93662621dfd8ac7f5211e3f428cdb0da"} Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.940198 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:25:51 crc kubenswrapper[4752]: I1124 11:25:51.954002 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec21c99-26e2-45f2-9d50-224cbec04721" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" Nov 24 11:25:52 crc kubenswrapper[4752]: I1124 11:25:52.215312 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 11:25:52 crc kubenswrapper[4752]: I1124 11:25:52.739484 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec21c99-26e2-45f2-9d50-224cbec04721" path="/var/lib/kubelet/pods/aec21c99-26e2-45f2-9d50-224cbec04721/volumes" Nov 24 11:25:54 crc kubenswrapper[4752]: I1124 11:25:54.888195 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57b7bbd86d-9drzs" Nov 24 11:25:54 crc kubenswrapper[4752]: I1124 11:25:54.891464 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57b7bbd86d-9drzs" Nov 24 11:25:54 crc kubenswrapper[4752]: I1124 11:25:54.985790 4752 generic.go:334] "Generic (PLEG): container finished" podID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerID="2083f407f786cbdc50075533918be421651e6ec55fbad9e1b8c579416e048543" exitCode=0 Nov 24 11:25:54 crc kubenswrapper[4752]: I1124 11:25:54.985909 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerDied","Data":"2083f407f786cbdc50075533918be421651e6ec55fbad9e1b8c579416e048543"} Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.311391 4752 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-76c46557f6-h7874"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.452286 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8kt\" (UniqueName: \"kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt\") pod \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") "
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.452469 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs\") pod \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") "
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.452558 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config\") pod \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") "
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.452579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle\") pod \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") "
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.452621 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config\") pod \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\" (UID: \"42096ad9-dd99-4b49-ad30-57e43f4e97a1\") "
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.457936 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "42096ad9-dd99-4b49-ad30-57e43f4e97a1" (UID: "42096ad9-dd99-4b49-ad30-57e43f4e97a1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.458497 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt" (OuterVolumeSpecName: "kube-api-access-5p8kt") pod "42096ad9-dd99-4b49-ad30-57e43f4e97a1" (UID: "42096ad9-dd99-4b49-ad30-57e43f4e97a1"). InnerVolumeSpecName "kube-api-access-5p8kt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.552683 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"]
Nov 24 11:25:55 crc kubenswrapper[4752]: E1124 11:25:55.553147 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-api"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.553168 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-api"
Nov 24 11:25:55 crc kubenswrapper[4752]: E1124 11:25:55.553200 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-httpd"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.553208 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-httpd"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.553403 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-httpd"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.553427 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" containerName="neutron-api"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.554540 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.555994 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-httpd-config\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.556039 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p8kt\" (UniqueName: \"kubernetes.io/projected/42096ad9-dd99-4b49-ad30-57e43f4e97a1-kube-api-access-5p8kt\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.559520 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.559811 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.559976 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.567017 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42096ad9-dd99-4b49-ad30-57e43f4e97a1" (UID: "42096ad9-dd99-4b49-ad30-57e43f4e97a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.582059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config" (OuterVolumeSpecName: "config") pod "42096ad9-dd99-4b49-ad30-57e43f4e97a1" (UID: "42096ad9-dd99-4b49-ad30-57e43f4e97a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.583217 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"]
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.593227 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "42096ad9-dd99-4b49-ad30-57e43f4e97a1" (UID: "42096ad9-dd99-4b49-ad30-57e43f4e97a1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.659999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660328 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660368 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660386 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660449 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660488 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59xw\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660529 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660574 4752 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.660587 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-config\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.661521 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42096ad9-dd99-4b49-ad30-57e43f4e97a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.762811 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.762902 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.762922 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.762982 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.762997 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.763062 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59xw\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.763131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.763192 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.763368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.772511 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.775274 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.775949 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.777435 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.778577 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.779713 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.803102 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59xw\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw\") pod \"swift-proxy-79cf79466c-6hfp8\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:55 crc kubenswrapper[4752]: I1124 11:25:55.900429 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.014702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76c46557f6-h7874" event={"ID":"42096ad9-dd99-4b49-ad30-57e43f4e97a1","Type":"ContainerDied","Data":"0ae6b9242a8a0dc207b186cc8a763c94f3d1c7de06c516cc7afb9b2a21473d4d"}
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.014961 4752 scope.go:117] "RemoveContainer" containerID="11fb1748bacf7cdd4881adda1adafd0b56fde11f31f6e1af46f23c8badc5f64a"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.015138 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76c46557f6-h7874"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.051035 4752 scope.go:117] "RemoveContainer" containerID="2083f407f786cbdc50075533918be421651e6ec55fbad9e1b8c579416e048543"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.063814 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76c46557f6-h7874"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.076627 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76c46557f6-h7874"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.382992 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hlrvc"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.384172 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.401134 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlrvc"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.477781 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-x9jpf"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.478947 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.482714 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.482848 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpmx\" (UniqueName: \"kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.488198 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x9jpf"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.517454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"]
Nov 24 11:25:56 crc kubenswrapper[4752]: W1124 11:25:56.520042 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b18be8a_5ff5_4b20_b6d5_5ca167f33583.slice/crio-32aecea165f1193639b453c2f9ab5c0e77b28185ea8e6c8fd5cb661270789aa4 WatchSource:0}: Error finding container 32aecea165f1193639b453c2f9ab5c0e77b28185ea8e6c8fd5cb661270789aa4: Status 404 returned error can't find the container with id 32aecea165f1193639b453c2f9ab5c0e77b28185ea8e6c8fd5cb661270789aa4
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.583623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmpmx\" (UniqueName: \"kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.583694 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b86sr\" (UniqueName: \"kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.583734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.583790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.584645 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.591790 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b683-account-create-zdwjb"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.592957 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.595391 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.598513 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fmqfk"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.599665 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.603593 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmpmx\" (UniqueName: \"kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx\") pod \"nova-api-db-create-hlrvc\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.628557 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b683-account-create-zdwjb"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.638064 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fmqfk"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.685318 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b86sr\" (UniqueName: \"kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.685433 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.686332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.699944 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlrvc"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.715445 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b86sr\" (UniqueName: \"kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr\") pod \"nova-cell0-db-create-x9jpf\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.743859 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42096ad9-dd99-4b49-ad30-57e43f4e97a1" path="/var/lib/kubelet/pods/42096ad9-dd99-4b49-ad30-57e43f4e97a1/volumes"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.787834 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.787891 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxzc\" (UniqueName: \"kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.788394 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.788424 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2sm\" (UniqueName: \"kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.792427 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x9jpf"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.811267 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-80e2-account-create-xlp8w"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.812406 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.821668 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.830370 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-80e2-account-create-xlp8w"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.889694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.889758 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2sm\" (UniqueName: \"kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.889829 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.889849 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxzc\" (UniqueName: \"kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.890756 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.891360 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.910999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxzc\" (UniqueName: \"kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc\") pod \"nova-api-b683-account-create-zdwjb\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.911584 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2sm\" (UniqueName: \"kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm\") pod \"nova-cell1-db-create-fmqfk\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.913544 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b683-account-create-zdwjb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.926963 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fmqfk"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.964932 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.965268 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-central-agent" containerID="cri-o://86db44d46e0e1fa3d0c22549d2bf06886013ac00affef218eb39fd11985bc94e" gracePeriod=30
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.966117 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="proxy-httpd" containerID="cri-o://5b4736909b3a1a5ca49bd05264baefafda1155d24a297c3da3c06add676e920d" gracePeriod=30
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.966172 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="sg-core" containerID="cri-o://78678ced3942ffc63c968c7e93809d754c643c147367ead303121fdfa7d27a16" gracePeriod=30
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.966203 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-notification-agent" containerID="cri-o://4afdae42fda582e2e9cc1f06a212428e40a4399ac84443a679b5537eca27548b" gracePeriod=30
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.973557 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.991434 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slwsg\" (UniqueName: \"kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.991588 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.992154 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-19d4-account-create-84bgb"]
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.993282 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:56 crc kubenswrapper[4752]: I1124 11:25:56.995097 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.026422 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-19d4-account-create-84bgb"]
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.081279 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerStarted","Data":"32aecea165f1193639b453c2f9ab5c0e77b28185ea8e6c8fd5cb661270789aa4"}
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.096202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.096368 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slwsg\" (UniqueName: \"kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.096477 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsxg7\" (UniqueName: \"kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.096526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.097330 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.115630 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slwsg\" (UniqueName: \"kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg\") pod \"nova-cell0-80e2-account-create-xlp8w\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.133149 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80e2-account-create-xlp8w"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.198358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsxg7\" (UniqueName: \"kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.198460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.199242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.214647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsxg7\" (UniqueName: \"kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7\") pod \"nova-cell1-19d4-account-create-84bgb\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.351165 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-19d4-account-create-84bgb"
Nov 24 11:25:57 crc kubenswrapper[4752]: I1124 11:25:57.527495 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110187 4752 generic.go:334] "Generic (PLEG): container finished" podID="7e27be68-eb58-40ff-8cae-686399b726a5" containerID="5b4736909b3a1a5ca49bd05264baefafda1155d24a297c3da3c06add676e920d" exitCode=0
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110228 4752 generic.go:334] "Generic (PLEG): container finished" podID="7e27be68-eb58-40ff-8cae-686399b726a5" containerID="78678ced3942ffc63c968c7e93809d754c643c147367ead303121fdfa7d27a16" exitCode=2
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110237 4752 generic.go:334] "Generic (PLEG): container finished" podID="7e27be68-eb58-40ff-8cae-686399b726a5" containerID="4afdae42fda582e2e9cc1f06a212428e40a4399ac84443a679b5537eca27548b" exitCode=0
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110248 4752 generic.go:334] "Generic (PLEG): container finished" podID="7e27be68-eb58-40ff-8cae-686399b726a5" containerID="86db44d46e0e1fa3d0c22549d2bf06886013ac00affef218eb39fd11985bc94e" exitCode=0
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110263 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerDied","Data":"5b4736909b3a1a5ca49bd05264baefafda1155d24a297c3da3c06add676e920d"}
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110304 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerDied","Data":"78678ced3942ffc63c968c7e93809d754c643c147367ead303121fdfa7d27a16"}
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110321 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerDied","Data":"4afdae42fda582e2e9cc1f06a212428e40a4399ac84443a679b5537eca27548b"}
Nov 24 11:25:58 crc kubenswrapper[4752]: I1124 11:25:58.110334 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerDied","Data":"86db44d46e0e1fa3d0c22549d2bf06886013ac00affef218eb39fd11985bc94e"}
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.150945 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.186632 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerStarted","Data":"38c41eef16564abd80395499860a5b1c7de6d60c007496e866d4d37f4047cebc"}
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.197434 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e27be68-eb58-40ff-8cae-686399b726a5","Type":"ContainerDied","Data":"67afd53a4f784f0add9e4745231637b977af697ce3962edad1952cd91f57f73c"}
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.197486 4752 scope.go:117] "RemoveContainer" containerID="5b4736909b3a1a5ca49bd05264baefafda1155d24a297c3da3c06add676e920d"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.197601 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218242 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd6fq\" (UniqueName: \"kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218486 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218542 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.218590 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data\") pod \"7e27be68-eb58-40ff-8cae-686399b726a5\" (UID: \"7e27be68-eb58-40ff-8cae-686399b726a5\") "
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.223854 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.224057 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.226531 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq" (OuterVolumeSpecName: "kube-api-access-vd6fq") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "kube-api-access-vd6fq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.231158 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts" (OuterVolumeSpecName: "scripts") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.231183 4752 scope.go:117] "RemoveContainer" containerID="78678ced3942ffc63c968c7e93809d754c643c147367ead303121fdfa7d27a16"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.259200 4752 scope.go:117] "RemoveContainer" containerID="4afdae42fda582e2e9cc1f06a212428e40a4399ac84443a679b5537eca27548b"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.288911 4752 scope.go:117] "RemoveContainer" containerID="86db44d46e0e1fa3d0c22549d2bf06886013ac00affef218eb39fd11985bc94e"
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.322287 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.322322 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.322336 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e27be68-eb58-40ff-8cae-686399b726a5-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.322347 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd6fq\" (UniqueName: \"kubernetes.io/projected/7e27be68-eb58-40ff-8cae-686399b726a5-kube-api-access-vd6fq\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.426339 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.495271 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-80e2-account-create-xlp8w"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.528946 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.591932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.637618 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.645881 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data" (OuterVolumeSpecName: "config-data") pod "7e27be68-eb58-40ff-8cae-686399b726a5" (UID: "7e27be68-eb58-40ff-8cae-686399b726a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:26:03 crc kubenswrapper[4752]: W1124 11:26:03.735245 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1469fa1_76e8_4cb0_a81f_2ec056462912.slice/crio-f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605 WatchSource:0}: Error finding container f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605: Status 404 returned error can't find the container with id f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.739091 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e27be68-eb58-40ff-8cae-686399b726a5-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.740858 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b683-account-create-zdwjb"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.752754 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.752999 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4777fb8a-2a08-4286-8c15-39d249db6db0" containerName="kube-state-metrics" containerID="cri-o://048913e60e85a005c432d1b709dfd27f75ae10ec635f5a2ae5749e8ba5d88dcb" gracePeriod=30
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.801852 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fmqfk"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.814643 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlrvc"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.869364 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x9jpf"]
Nov 24 11:26:03 crc kubenswrapper[4752]: I1124 11:26:03.885916 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-19d4-account-create-84bgb"]
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.038818 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.046171 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.053981 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:26:04 crc kubenswrapper[4752]: E1124 11:26:04.054597 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="sg-core"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054609 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="sg-core"
Nov 24 11:26:04 crc kubenswrapper[4752]: E1124 11:26:04.054629 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-central-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054635 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-central-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: E1124 11:26:04.054646 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-notification-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054651 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-notification-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: E1124 11:26:04.054658 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="proxy-httpd"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054664 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="proxy-httpd"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054863 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-notification-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054878 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="proxy-httpd"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054895 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="sg-core"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.054902 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" containerName="ceilometer-central-agent"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.056409 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.059969 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.060281 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.060487 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154368 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154430 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wb8\" (UniqueName: \"kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154460 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154547 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.154629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.220703 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlrvc" event={"ID":"3acd95db-8b48-48c6-9153-d4f2dd07745a","Type":"ContainerStarted","Data":"c00781bab3d87dde400d495b8256e364ba1e448388264783e10a992d73a3a2e2"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.224136 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fmqfk" event={"ID":"94e8d35f-96ad-4182-a7dd-2adbbb113508","Type":"ContainerStarted","Data":"70521b0e317f3ef89430e40951208d988f5db13bb69f0a7ae1b19c8f70d137ec"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.229566 4752 generic.go:334] "Generic (PLEG): container finished" podID="4777fb8a-2a08-4286-8c15-39d249db6db0" containerID="048913e60e85a005c432d1b709dfd27f75ae10ec635f5a2ae5749e8ba5d88dcb" exitCode=2
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.229678 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4777fb8a-2a08-4286-8c15-39d249db6db0","Type":"ContainerDied","Data":"048913e60e85a005c432d1b709dfd27f75ae10ec635f5a2ae5749e8ba5d88dcb"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.255326 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-hlrvc" podStartSLOduration=8.255304333 podStartE2EDuration="8.255304333s" podCreationTimestamp="2025-11-24 11:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:04.23927543 +0000 UTC m=+1170.224095719" watchObservedRunningTime="2025-11-24 11:26:04.255304333 +0000 UTC m=+1170.240124622"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.257320 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.257672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.257961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.258104 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.258345 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wb8\" (UniqueName: \"kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.258795 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.258934 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.263597 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.263839 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.263882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.265583 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.274512 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.275349 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.275528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09540fa5-6ff8-45cc-98be-968283dc2bfd","Type":"ContainerStarted","Data":"cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.292489 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wb8\" (UniqueName: \"kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8\") pod \"ceilometer-0\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " pod="openstack/ceilometer-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.294145 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x9jpf" event={"ID":"b9e6380d-e699-4757-ac88-6d245859dadd","Type":"ContainerStarted","Data":"f2f5db21c5a08aebeae1a66ae46f6d7d12a7e6d21f449c78f8c5330aa652e596"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.299753 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.516748484 podStartE2EDuration="14.299722467s" podCreationTimestamp="2025-11-24 11:25:50 +0000 UTC" firstStartedPulling="2025-11-24 11:25:51.013660181 +0000 UTC m=+1156.998480470" lastFinishedPulling="2025-11-24 11:26:02.796634164 +0000 UTC m=+1168.781454453" observedRunningTime="2025-11-24 11:26:04.294143376 +0000 UTC m=+1170.278963665" watchObservedRunningTime="2025-11-24 11:26:04.299722467 +0000 UTC m=+1170.284542756"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.311031 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.313034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b683-account-create-zdwjb" event={"ID":"e1469fa1-76e8-4cb0-a81f-2ec056462912","Type":"ContainerStarted","Data":"f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.325594 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80e2-account-create-xlp8w" event={"ID":"e0419545-a8e5-484a-8b23-8e57bd2a6d7b","Type":"ContainerStarted","Data":"3ce9fb668d269804d6dd2f2c8f344a9ec3cad9852de93a6c221320e395600d05"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.325640 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80e2-account-create-xlp8w" event={"ID":"e0419545-a8e5-484a-8b23-8e57bd2a6d7b","Type":"ContainerStarted","Data":"24087204fd5e6fc4d3bec4895b7b5b8f4a494b554821243fa3cf8f9d5728a919"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.347124 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-19d4-account-create-84bgb" event={"ID":"293563af-85f4-47d2-87d0-15f7172c173f","Type":"ContainerStarted","Data":"0580219c23c4a2ec91d02a2cae44b0c8ffcf8df9d3bb90f28cb8b58430b9e93f"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.353046 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerStarted","Data":"3971caac8147987cbdf6ff339a4a02e13902ca6259e0896f8df16f72509dc49d"}
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.353886 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.353914 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79cf79466c-6hfp8"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.358045 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-b683-account-create-zdwjb" podStartSLOduration=8.358021962 podStartE2EDuration="8.358021962s" podCreationTimestamp="2025-11-24 11:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:04.346756076 +0000 UTC m=+1170.331576365" watchObservedRunningTime="2025-11-24 11:26:04.358021962 +0000 UTC m=+1170.342842251"
Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.371978 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-80e2-account-create-xlp8w" podStartSLOduration=8.371957674 podStartE2EDuration="8.371957674s" podCreationTimestamp="2025-11-24 11:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-
11:26:04.370226654 +0000 UTC m=+1170.355046943" watchObservedRunningTime="2025-11-24 11:26:04.371957674 +0000 UTC m=+1170.356777963" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.402306 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-79cf79466c-6hfp8" podStartSLOduration=9.402287061 podStartE2EDuration="9.402287061s" podCreationTimestamp="2025-11-24 11:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:04.392191009 +0000 UTC m=+1170.377011298" watchObservedRunningTime="2025-11-24 11:26:04.402287061 +0000 UTC m=+1170.387107350" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.404187 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.417003 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-19d4-account-create-84bgb" podStartSLOduration=8.416988505 podStartE2EDuration="8.416988505s" podCreationTimestamp="2025-11-24 11:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:04.414183044 +0000 UTC m=+1170.399003323" watchObservedRunningTime="2025-11-24 11:26:04.416988505 +0000 UTC m=+1170.401808794" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.466627 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5lgt\" (UniqueName: \"kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt\") pod \"4777fb8a-2a08-4286-8c15-39d249db6db0\" (UID: \"4777fb8a-2a08-4286-8c15-39d249db6db0\") " Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.472040 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt" (OuterVolumeSpecName: "kube-api-access-s5lgt") pod "4777fb8a-2a08-4286-8c15-39d249db6db0" (UID: "4777fb8a-2a08-4286-8c15-39d249db6db0"). InnerVolumeSpecName "kube-api-access-s5lgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.570305 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5lgt\" (UniqueName: \"kubernetes.io/projected/4777fb8a-2a08-4286-8c15-39d249db6db0-kube-api-access-s5lgt\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.770896 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e27be68-eb58-40ff-8cae-686399b726a5" path="/var/lib/kubelet/pods/7e27be68-eb58-40ff-8cae-686399b726a5/volumes" Nov 24 11:26:04 crc kubenswrapper[4752]: I1124 11:26:04.790314 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:04 crc kubenswrapper[4752]: W1124 11:26:04.862340 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b6fbdcc_415b_458d_bfa2_f91fb6794d7f.slice/crio-c64871926fe9dd64686c7f9f93636bcdf194c45d0f3307d9579668fc591580d1 WatchSource:0}: Error finding container c64871926fe9dd64686c7f9f93636bcdf194c45d0f3307d9579668fc591580d1: Status 404 returned error can't find the container with id c64871926fe9dd64686c7f9f93636bcdf194c45d0f3307d9579668fc591580d1 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.367527 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4777fb8a-2a08-4286-8c15-39d249db6db0","Type":"ContainerDied","Data":"a5a1ce0dd28425425124e8919363775aaf4eda0b4450b67adff2e30e758a681e"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.367575 4752 scope.go:117] "RemoveContainer" containerID="048913e60e85a005c432d1b709dfd27f75ae10ec635f5a2ae5749e8ba5d88dcb" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.367586 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.373239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerStarted","Data":"c64871926fe9dd64686c7f9f93636bcdf194c45d0f3307d9579668fc591580d1"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.376364 4752 generic.go:334] "Generic (PLEG): container finished" podID="3acd95db-8b48-48c6-9153-d4f2dd07745a" containerID="fad4f8068e453f4f0adeb3e91bcabcf8d768979c4e10c48b561d2681459e57aa" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.376623 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlrvc" event={"ID":"3acd95db-8b48-48c6-9153-d4f2dd07745a","Type":"ContainerDied","Data":"fad4f8068e453f4f0adeb3e91bcabcf8d768979c4e10c48b561d2681459e57aa"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.380071 4752 generic.go:334] "Generic (PLEG): container finished" podID="b9e6380d-e699-4757-ac88-6d245859dadd" containerID="020c79cb453e7768988ebf442209375f39cfa455420af82063716113800a4446" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.380107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x9jpf" event={"ID":"b9e6380d-e699-4757-ac88-6d245859dadd","Type":"ContainerDied","Data":"020c79cb453e7768988ebf442209375f39cfa455420af82063716113800a4446"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.397925 4752 generic.go:334] "Generic (PLEG): container finished" podID="e1469fa1-76e8-4cb0-a81f-2ec056462912" containerID="b2e200c05ccf3e61807da30a56ceeccd81731dd7d3c5381db6ec4a545e58aceb" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.398034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b683-account-create-zdwjb" event={"ID":"e1469fa1-76e8-4cb0-a81f-2ec056462912","Type":"ContainerDied","Data":"b2e200c05ccf3e61807da30a56ceeccd81731dd7d3c5381db6ec4a545e58aceb"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.404182 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fmqfk" event={"ID":"94e8d35f-96ad-4182-a7dd-2adbbb113508","Type":"ContainerDied","Data":"cd58caf1dc39366b3251ce0c0c7889a1bdc484f870e4acf5136461945278d7e8"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.404193 4752 generic.go:334] "Generic (PLEG): container finished" podID="94e8d35f-96ad-4182-a7dd-2adbbb113508" containerID="cd58caf1dc39366b3251ce0c0c7889a1bdc484f870e4acf5136461945278d7e8" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.405024 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.415969 4752 generic.go:334] "Generic (PLEG): container finished" podID="e0419545-a8e5-484a-8b23-8e57bd2a6d7b" containerID="3ce9fb668d269804d6dd2f2c8f344a9ec3cad9852de93a6c221320e395600d05" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.416057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80e2-account-create-xlp8w" event={"ID":"e0419545-a8e5-484a-8b23-8e57bd2a6d7b","Type":"ContainerDied","Data":"3ce9fb668d269804d6dd2f2c8f344a9ec3cad9852de93a6c221320e395600d05"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.417822 4752 generic.go:334] "Generic (PLEG): container finished" podID="293563af-85f4-47d2-87d0-15f7172c173f" 
containerID="cc9d034a28c28ef8d0777d5e4c3d5981c8b605ee235018113534b49a6c30c837" exitCode=0 Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.418040 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-19d4-account-create-84bgb" event={"ID":"293563af-85f4-47d2-87d0-15f7172c173f","Type":"ContainerDied","Data":"cc9d034a28c28ef8d0777d5e4c3d5981c8b605ee235018113534b49a6c30c837"} Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.420276 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.439522 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:26:05 crc kubenswrapper[4752]: E1124 11:26:05.439977 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4777fb8a-2a08-4286-8c15-39d249db6db0" containerName="kube-state-metrics" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.439994 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4777fb8a-2a08-4286-8c15-39d249db6db0" containerName="kube-state-metrics" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.440182 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4777fb8a-2a08-4286-8c15-39d249db6db0" containerName="kube-state-metrics" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.440733 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.446949 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.447099 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.467670 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.589112 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.589171 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.589196 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5xc\" (UniqueName: \"kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.589229 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle\") pod 
\"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.690992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.691058 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.691094 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5xc\" (UniqueName: \"kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.691147 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.697107 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.716530 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.716546 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.722529 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5xc\" (UniqueName: \"kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc\") pod \"kube-state-metrics-0\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") " pod="openstack/kube-state-metrics-0" Nov 24 11:26:05 crc kubenswrapper[4752]: I1124 11:26:05.766966 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 11:26:06 crc kubenswrapper[4752]: I1124 11:26:06.291523 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:26:06 crc kubenswrapper[4752]: I1124 11:26:06.427606 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddb825f2-a05f-409d-b4cc-80408e2db5d7","Type":"ContainerStarted","Data":"37d0f849cb5dc5f2ad9919e4c2d9c244d2e10f4fb87a539f8bdafee6df4bde47"} Nov 24 11:26:06 crc kubenswrapper[4752]: I1124 11:26:06.593990 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:06 crc kubenswrapper[4752]: I1124 11:26:06.753677 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4777fb8a-2a08-4286-8c15-39d249db6db0" path="/var/lib/kubelet/pods/4777fb8a-2a08-4286-8c15-39d249db6db0/volumes" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.074021 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x9jpf" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.231702 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b86sr\" (UniqueName: \"kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr\") pod \"b9e6380d-e699-4757-ac88-6d245859dadd\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.231761 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts\") pod \"b9e6380d-e699-4757-ac88-6d245859dadd\" (UID: \"b9e6380d-e699-4757-ac88-6d245859dadd\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.233561 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9e6380d-e699-4757-ac88-6d245859dadd" (UID: "b9e6380d-e699-4757-ac88-6d245859dadd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.247125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr" (OuterVolumeSpecName: "kube-api-access-b86sr") pod "b9e6380d-e699-4757-ac88-6d245859dadd" (UID: "b9e6380d-e699-4757-ac88-6d245859dadd"). InnerVolumeSpecName "kube-api-access-b86sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.316181 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-19d4-account-create-84bgb" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.324084 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hlrvc" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.334691 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b86sr\" (UniqueName: \"kubernetes.io/projected/b9e6380d-e699-4757-ac88-6d245859dadd-kube-api-access-b86sr\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.334737 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9e6380d-e699-4757-ac88-6d245859dadd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.337202 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fmqfk" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.354459 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80e2-account-create-xlp8w" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.370458 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b683-account-create-zdwjb" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437197 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2sm\" (UniqueName: \"kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm\") pod \"94e8d35f-96ad-4182-a7dd-2adbbb113508\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437258 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmpmx\" (UniqueName: \"kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx\") pod \"3acd95db-8b48-48c6-9153-d4f2dd07745a\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437308 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts\") pod \"3acd95db-8b48-48c6-9153-d4f2dd07745a\" (UID: \"3acd95db-8b48-48c6-9153-d4f2dd07745a\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437330 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slwsg\" (UniqueName: \"kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg\") pod \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437371 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts\") pod \"94e8d35f-96ad-4182-a7dd-2adbbb113508\" (UID: \"94e8d35f-96ad-4182-a7dd-2adbbb113508\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437507 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts\") pod \"293563af-85f4-47d2-87d0-15f7172c173f\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437532 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts\") pod \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\" (UID: \"e0419545-a8e5-484a-8b23-8e57bd2a6d7b\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.437598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsxg7\" (UniqueName: \"kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7\") pod \"293563af-85f4-47d2-87d0-15f7172c173f\" (UID: \"293563af-85f4-47d2-87d0-15f7172c173f\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.439967 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "293563af-85f4-47d2-87d0-15f7172c173f" (UID: "293563af-85f4-47d2-87d0-15f7172c173f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.440350 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx" (OuterVolumeSpecName: "kube-api-access-nmpmx") pod "3acd95db-8b48-48c6-9153-d4f2dd07745a" (UID: "3acd95db-8b48-48c6-9153-d4f2dd07745a"). InnerVolumeSpecName "kube-api-access-nmpmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.440408 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0419545-a8e5-484a-8b23-8e57bd2a6d7b" (UID: "e0419545-a8e5-484a-8b23-8e57bd2a6d7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.440957 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94e8d35f-96ad-4182-a7dd-2adbbb113508" (UID: "94e8d35f-96ad-4182-a7dd-2adbbb113508"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.441593 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3acd95db-8b48-48c6-9153-d4f2dd07745a" (UID: "3acd95db-8b48-48c6-9153-d4f2dd07745a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.442689 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b683-account-create-zdwjb" event={"ID":"e1469fa1-76e8-4cb0-a81f-2ec056462912","Type":"ContainerDied","Data":"f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.442723 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b683-account-create-zdwjb" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.442732 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43e995fc8554050399b09fddcb9511e8e447b1e227a0d6100d851d8669a1605" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.445899 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7" (OuterVolumeSpecName: "kube-api-access-jsxg7") pod "293563af-85f4-47d2-87d0-15f7172c173f" (UID: "293563af-85f4-47d2-87d0-15f7172c173f"). InnerVolumeSpecName "kube-api-access-jsxg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.446164 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm" (OuterVolumeSpecName: "kube-api-access-gg2sm") pod "94e8d35f-96ad-4182-a7dd-2adbbb113508" (UID: "94e8d35f-96ad-4182-a7dd-2adbbb113508"). InnerVolumeSpecName "kube-api-access-gg2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.446382 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fmqfk" event={"ID":"94e8d35f-96ad-4182-a7dd-2adbbb113508","Type":"ContainerDied","Data":"70521b0e317f3ef89430e40951208d988f5db13bb69f0a7ae1b19c8f70d137ec"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.446417 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70521b0e317f3ef89430e40951208d988f5db13bb69f0a7ae1b19c8f70d137ec" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.446388 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fmqfk" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.448314 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80e2-account-create-xlp8w" event={"ID":"e0419545-a8e5-484a-8b23-8e57bd2a6d7b","Type":"ContainerDied","Data":"24087204fd5e6fc4d3bec4895b7b5b8f4a494b554821243fa3cf8f9d5728a919"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.448356 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24087204fd5e6fc4d3bec4895b7b5b8f4a494b554821243fa3cf8f9d5728a919" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.448402 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80e2-account-create-xlp8w" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.451736 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-19d4-account-create-84bgb" event={"ID":"293563af-85f4-47d2-87d0-15f7172c173f","Type":"ContainerDied","Data":"0580219c23c4a2ec91d02a2cae44b0c8ffcf8df9d3bb90f28cb8b58430b9e93f"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.451851 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0580219c23c4a2ec91d02a2cae44b0c8ffcf8df9d3bb90f28cb8b58430b9e93f" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.451927 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-19d4-account-create-84bgb" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.468514 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg" (OuterVolumeSpecName: "kube-api-access-slwsg") pod "e0419545-a8e5-484a-8b23-8e57bd2a6d7b" (UID: "e0419545-a8e5-484a-8b23-8e57bd2a6d7b"). InnerVolumeSpecName "kube-api-access-slwsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.478588 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlrvc" event={"ID":"3acd95db-8b48-48c6-9153-d4f2dd07745a","Type":"ContainerDied","Data":"c00781bab3d87dde400d495b8256e364ba1e448388264783e10a992d73a3a2e2"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.478621 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00781bab3d87dde400d495b8256e364ba1e448388264783e10a992d73a3a2e2" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.478672 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlrvc" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.496877 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x9jpf" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.497287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x9jpf" event={"ID":"b9e6380d-e699-4757-ac88-6d245859dadd","Type":"ContainerDied","Data":"f2f5db21c5a08aebeae1a66ae46f6d7d12a7e6d21f449c78f8c5330aa652e596"} Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.497334 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2f5db21c5a08aebeae1a66ae46f6d7d12a7e6d21f449c78f8c5330aa652e596" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.540157 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts\") pod \"e1469fa1-76e8-4cb0-a81f-2ec056462912\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.540258 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxzc\" (UniqueName: \"kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc\") pod \"e1469fa1-76e8-4cb0-a81f-2ec056462912\" (UID: \"e1469fa1-76e8-4cb0-a81f-2ec056462912\") " Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.540956 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1469fa1-76e8-4cb0-a81f-2ec056462912" (UID: "e1469fa1-76e8-4cb0-a81f-2ec056462912"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541352 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsxg7\" (UniqueName: \"kubernetes.io/projected/293563af-85f4-47d2-87d0-15f7172c173f-kube-api-access-jsxg7\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541381 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1469fa1-76e8-4cb0-a81f-2ec056462912-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541390 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2sm\" (UniqueName: \"kubernetes.io/projected/94e8d35f-96ad-4182-a7dd-2adbbb113508-kube-api-access-gg2sm\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541402 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmpmx\" (UniqueName: \"kubernetes.io/projected/3acd95db-8b48-48c6-9153-d4f2dd07745a-kube-api-access-nmpmx\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541412 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3acd95db-8b48-48c6-9153-d4f2dd07745a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541424 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slwsg\" (UniqueName: \"kubernetes.io/projected/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-kube-api-access-slwsg\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541433 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94e8d35f-96ad-4182-a7dd-2adbbb113508-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541442 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/293563af-85f4-47d2-87d0-15f7172c173f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.541451 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0419545-a8e5-484a-8b23-8e57bd2a6d7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.544233 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc" (OuterVolumeSpecName: "kube-api-access-vgxzc") pod "e1469fa1-76e8-4cb0-a81f-2ec056462912" (UID: "e1469fa1-76e8-4cb0-a81f-2ec056462912"). InnerVolumeSpecName "kube-api-access-vgxzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:07 crc kubenswrapper[4752]: I1124 11:26:07.643198 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxzc\" (UniqueName: \"kubernetes.io/projected/e1469fa1-76e8-4cb0-a81f-2ec056462912-kube-api-access-vgxzc\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:08 crc kubenswrapper[4752]: I1124 11:26:08.506844 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddb825f2-a05f-409d-b4cc-80408e2db5d7","Type":"ContainerStarted","Data":"7d8a87521315ef54b08c4e35b1d82277cba49344eb491d6cd687e4acfa7826e7"} Nov 24 11:26:08 crc kubenswrapper[4752]: I1124 11:26:08.507398 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 24 11:26:08 crc kubenswrapper[4752]: I1124 11:26:08.509083 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerStarted","Data":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} Nov 24 11:26:08 crc kubenswrapper[4752]: I1124 11:26:08.509114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerStarted","Data":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} Nov 24 11:26:08 crc kubenswrapper[4752]: I1124 11:26:08.534098 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.403849692 podStartE2EDuration="3.534074981s" podCreationTimestamp="2025-11-24 11:26:05 +0000 UTC" firstStartedPulling="2025-11-24 11:26:06.285146437 +0000 UTC m=+1172.269966726" lastFinishedPulling="2025-11-24 11:26:07.415371736 +0000 UTC m=+1173.400192015" observedRunningTime="2025-11-24 11:26:08.523584708 +0000 UTC m=+1174.508405007" watchObservedRunningTime="2025-11-24 11:26:08.534074981 +0000 UTC m=+1174.518895270" Nov 24 11:26:09 crc kubenswrapper[4752]: I1124 11:26:09.518475 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerStarted","Data":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} Nov 24 11:26:10 crc kubenswrapper[4752]: I1124 11:26:10.913147 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79cf79466c-6hfp8" Nov 24 11:26:10 crc kubenswrapper[4752]: I1124 11:26:10.914276 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79cf79466c-6hfp8" Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.552057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerStarted","Data":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.552384 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-central-agent" containerID="cri-o://3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" gracePeriod=30 Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.552434 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-notification-agent" containerID="cri-o://7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" gracePeriod=30 Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.552455 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="sg-core" containerID="cri-o://76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" gracePeriod=30 Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.552384 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="proxy-httpd" containerID="cri-o://a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" gracePeriod=30 Nov 24 11:26:11 crc kubenswrapper[4752]: I1124 11:26:11.577046 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.939170599 podStartE2EDuration="7.577019448s" podCreationTimestamp="2025-11-24 11:26:04 +0000 UTC" firstStartedPulling="2025-11-24 11:26:04.870432978 +0000 UTC m=+1170.855253267" lastFinishedPulling="2025-11-24 11:26:10.508281827 +0000 UTC m=+1176.493102116" observedRunningTime="2025-11-24 11:26:11.573935919 +0000 UTC m=+1177.558756218" watchObservedRunningTime="2025-11-24 11:26:11.577019448 +0000 UTC m=+1177.561839737" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.308420 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321295 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qgtl"] Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321621 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0419545-a8e5-484a-8b23-8e57bd2a6d7b" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321636 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0419545-a8e5-484a-8b23-8e57bd2a6d7b" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321650 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-notification-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321656 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-notification-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321674 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1469fa1-76e8-4cb0-a81f-2ec056462912" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321682 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1469fa1-76e8-4cb0-a81f-2ec056462912" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321688 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="sg-core" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321694 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="sg-core" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321709 4752 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="94e8d35f-96ad-4182-a7dd-2adbbb113508" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321715 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e8d35f-96ad-4182-a7dd-2adbbb113508" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321729 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3acd95db-8b48-48c6-9153-d4f2dd07745a" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321734 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3acd95db-8b48-48c6-9153-d4f2dd07745a" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321756 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293563af-85f4-47d2-87d0-15f7172c173f" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321762 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="293563af-85f4-47d2-87d0-15f7172c173f" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321772 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-central-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321777 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-central-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321787 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="proxy-httpd" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321794 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="proxy-httpd" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.321808 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e6380d-e699-4757-ac88-6d245859dadd" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321814 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e6380d-e699-4757-ac88-6d245859dadd" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321975 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="sg-core" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.321991 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-notification-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322000 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e8d35f-96ad-4182-a7dd-2adbbb113508" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322008 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="ceilometer-central-agent" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322019 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1469fa1-76e8-4cb0-a81f-2ec056462912" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322029 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3acd95db-8b48-48c6-9153-d4f2dd07745a" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322036 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerName="proxy-httpd" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322048 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0419545-a8e5-484a-8b23-8e57bd2a6d7b" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322054 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="293563af-85f4-47d2-87d0-15f7172c173f" containerName="mariadb-account-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322062 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e6380d-e699-4757-ac88-6d245859dadd" containerName="mariadb-database-create" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.322571 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.324873 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.325115 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q9v5z" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.326215 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.334116 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qgtl"] Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423253 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423387 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wb8\" (UniqueName: \"kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423442 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423487 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: 
\"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423675 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423709 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.423976 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424201 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmccx\" (UniqueName: \"kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424317 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424383 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424757 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.424781 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.429211 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts" (OuterVolumeSpecName: "scripts") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.430179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8" (OuterVolumeSpecName: "kube-api-access-t2wb8") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "kube-api-access-t2wb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.466891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.512069 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526142 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data" (OuterVolumeSpecName: "config-data") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526395 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") pod \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\" (UID: \"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f\") " Nov 24 11:26:12 crc kubenswrapper[4752]: W1124 11:26:12.526553 4752 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f/volumes/kubernetes.io~secret/config-data Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data" (OuterVolumeSpecName: "config-data") pod "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" (UID: "7b6fbdcc-415b-458d-bfa2-f91fb6794d7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526700 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmccx\" (UniqueName: \"kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526902 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.526980 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.527070 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.527088 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.527099 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wb8\" (UniqueName: \"kubernetes.io/projected/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-kube-api-access-t2wb8\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.527108 4752 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.527116 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.530586 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.531052 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.532056 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.542693 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmccx\" (UniqueName: \"kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx\") pod \"nova-cell0-conductor-db-sync-7qgtl\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563296 4752 generic.go:334] "Generic (PLEG): container finished" podID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" exitCode=0 Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563335 4752 generic.go:334] "Generic (PLEG): container finished" podID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" exitCode=2 Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563346 4752 generic.go:334] "Generic (PLEG): container finished" podID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" exitCode=0 Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563354 4752 generic.go:334] "Generic (PLEG): container finished" podID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" exitCode=0 Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563373 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerDied","Data":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563399 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerDied","Data":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563409 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerDied","Data":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerDied","Data":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563427 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b6fbdcc-415b-458d-bfa2-f91fb6794d7f","Type":"ContainerDied","Data":"c64871926fe9dd64686c7f9f93636bcdf194c45d0f3307d9579668fc591580d1"} Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563443 4752 scope.go:117] "RemoveContainer" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.563562 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.603014 4752 scope.go:117] "RemoveContainer" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.607186 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.613557 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.634434 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.640274 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.642126 4752 scope.go:117] "RemoveContainer" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.643469 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.658587 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.658883 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.659162 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.690821 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.731641 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732244 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732364 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732389 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732476 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5n8\" (UniqueName: \"kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732546 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.732596 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.745669 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6fbdcc-415b-458d-bfa2-f91fb6794d7f" path="/var/lib/kubelet/pods/7b6fbdcc-415b-458d-bfa2-f91fb6794d7f/volumes" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.797126 4752 scope.go:117] "RemoveContainer" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.825336 4752 scope.go:117] "RemoveContainer" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.825719 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": container with ID starting with a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93 not found: ID does not exist" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.825767 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} err="failed to get container status \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": rpc error: code = NotFound desc = could not find container \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": container with ID starting with a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.825791 4752 scope.go:117] "RemoveContainer" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.828190 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": container with ID starting with 76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35 not found: ID does not exist" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.828222 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} err="failed to get container status \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": rpc error: code = NotFound desc = could not find container \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": container with ID starting with 76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.828246 4752 scope.go:117] "RemoveContainer" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.830315 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": container with ID starting with 
7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba not found: ID does not exist" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.830350 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} err="failed to get container status \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": rpc error: code = NotFound desc = could not find container \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": container with ID starting with 7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.830374 4752 scope.go:117] "RemoveContainer" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: E1124 11:26:12.830681 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": container with ID starting with 3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904 not found: ID does not exist" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.830705 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} err="failed to get container status \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": rpc error: code = NotFound desc = could not find container \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": container with ID starting with 3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.830722 4752 scope.go:117] "RemoveContainer" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.830989 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} err="failed to get container status \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": rpc error: code = NotFound desc = could not find container \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": container with ID starting with a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831026 4752 scope.go:117] "RemoveContainer" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831213 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} err="failed to get container status \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": rpc error: code = NotFound desc = could not find container \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": container with ID starting with 76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35 not found: ID does not exist" Nov 24 
11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831227 4752 scope.go:117] "RemoveContainer" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831514 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} err="failed to get container status \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": rpc error: code = NotFound desc = could not find container \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": container with ID starting with 7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831538 4752 scope.go:117] "RemoveContainer" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831880 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} err="failed to get container status \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": rpc error: code = NotFound desc = could not find container \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": container with ID starting with 3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.831923 4752 scope.go:117] "RemoveContainer" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.832969 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} err="failed to get container status \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": rpc error: code = NotFound desc = could not find container \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": container with ID starting with a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.832993 4752 scope.go:117] "RemoveContainer" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.834713 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} err="failed to get container status \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": rpc error: code = NotFound desc = could not find container \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": container with ID starting with 76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.834737 4752 scope.go:117] "RemoveContainer" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.835052 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} err="failed to get container status 
\"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": rpc error: code = NotFound desc = could not find container \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": container with ID starting with 7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.835094 4752 scope.go:117] "RemoveContainer" containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.837885 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} err="failed to get container status \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": rpc error: code = NotFound desc = could not find container \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": container with ID starting with 3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.837919 4752 scope.go:117] "RemoveContainer" containerID="a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.838798 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93"} err="failed to get container status \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": rpc error: code = NotFound desc = could not find container \"a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93\": container with ID starting with a65de3a78cc41a40e7ba884c2255d39b1a283a5bbce01d962354f63739f6db93 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.838821 4752 scope.go:117] "RemoveContainer" containerID="76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840162 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840229 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840369 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840397 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840446 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw5n8\" (UniqueName: \"kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840474 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.840524 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.842993 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.843145 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.843189 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35"} err="failed to get container status \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": rpc error: code = NotFound desc = could not find container \"76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35\": container with ID starting with 76963827a185faf059c72f4b0c5f7c1ecd0cb3e3801ed477bef37207f4d2ca35 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.843231 4752 scope.go:117] "RemoveContainer" containerID="7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.844377 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba"} err="failed to get container status \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": rpc error: code = NotFound desc = could not find container \"7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba\": container with ID starting with 7bf803de4b2c3d023a813714e5a79610e5331ed76059f7f8d5d491ea210962ba not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.844399 4752 scope.go:117] "RemoveContainer" 
containerID="3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.844858 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904"} err="failed to get container status \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": rpc error: code = NotFound desc = could not find container \"3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904\": container with ID starting with 3de8cbe7d646d4f2c03050c329f8a77a605374ffd9beb6933a45288819611904 not found: ID does not exist" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.850341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.854296 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.858020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.859738 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.865977 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5n8\" (UniqueName: \"kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:12 crc kubenswrapper[4752]: I1124 11:26:12.881581 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " pod="openstack/ceilometer-0" Nov 24 11:26:13 crc kubenswrapper[4752]: I1124 11:26:13.085617 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:13 crc kubenswrapper[4752]: I1124 11:26:13.154621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qgtl"] Nov 24 11:26:13 crc kubenswrapper[4752]: I1124 11:26:13.578168 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:13 crc kubenswrapper[4752]: I1124 11:26:13.585269 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" event={"ID":"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55","Type":"ContainerStarted","Data":"fa63b3bed5e17fc9e7b3732fbd1d0d90ffbe7362bfd476db19f0fb8ddd91e110"} Nov 24 11:26:13 crc kubenswrapper[4752]: W1124 11:26:13.586208 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd02d2c0_6c03_46cc_9bc6_de7f16b89acb.slice/crio-04020ccb08b6bf747bde4dc2940f04e4046a354c63a37476d8bfef5b3271c8d1 WatchSource:0}: Error finding container 04020ccb08b6bf747bde4dc2940f04e4046a354c63a37476d8bfef5b3271c8d1: Status 404 returned error can't find the container with id 04020ccb08b6bf747bde4dc2940f04e4046a354c63a37476d8bfef5b3271c8d1 Nov 24 11:26:14 crc kubenswrapper[4752]: I1124 11:26:14.616153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerStarted","Data":"abbdc6f2496d21d6ed2c61a9b2bf0510522fadd077c510dfa54caa53686b0689"} Nov 24 11:26:14 crc kubenswrapper[4752]: I1124 11:26:14.616491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerStarted","Data":"04020ccb08b6bf747bde4dc2940f04e4046a354c63a37476d8bfef5b3271c8d1"} Nov 24 11:26:15 crc kubenswrapper[4752]: I1124 11:26:15.660655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerStarted","Data":"c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082"} Nov 24 11:26:15 crc kubenswrapper[4752]: I1124 11:26:15.784904 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 24 11:26:16 crc kubenswrapper[4752]: I1124 11:26:16.198625 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:16 crc kubenswrapper[4752]: I1124 11:26:16.672442 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerStarted","Data":"87efcae88cedc15462a8fe69c88dbd7d70162685faf240bdfb47620ec03b9460"} Nov 24 11:26:17 crc kubenswrapper[4752]: I1124 11:26:17.049664 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:17 crc kubenswrapper[4752]: I1124 11:26:17.049981 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-log" containerID="cri-o://e81d5a0b95235237f39a31dd9ae847f30d22805d65c467d4d9437a45460ca9d7" gracePeriod=30 Nov 24 11:26:17 crc kubenswrapper[4752]: I1124 11:26:17.050056 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-httpd" 
containerID="cri-o://15375c58a53fe633e1759e20c9d16ad63468a36df47866fd2c275d06f38ec48e" gracePeriod=30 Nov 24 11:26:17 crc kubenswrapper[4752]: I1124 11:26:17.684823 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerID="e81d5a0b95235237f39a31dd9ae847f30d22805d65c467d4d9437a45460ca9d7" exitCode=143 Nov 24 11:26:17 crc kubenswrapper[4752]: I1124 11:26:17.685136 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerDied","Data":"e81d5a0b95235237f39a31dd9ae847f30d22805d65c467d4d9437a45460ca9d7"} Nov 24 11:26:18 crc kubenswrapper[4752]: I1124 11:26:18.819569 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:18 crc kubenswrapper[4752]: I1124 11:26:18.820282 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-httpd" containerID="cri-o://8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d" gracePeriod=30 Nov 24 11:26:18 crc kubenswrapper[4752]: I1124 11:26:18.819951 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-log" containerID="cri-o://efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928" gracePeriod=30 Nov 24 11:26:19 crc kubenswrapper[4752]: I1124 11:26:19.711237 4752 generic.go:334] "Generic (PLEG): container finished" podID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerID="efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928" exitCode=143 Nov 24 11:26:19 crc kubenswrapper[4752]: I1124 11:26:19.711314 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerDied","Data":"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928"} Nov 24 11:26:20 crc kubenswrapper[4752]: I1124 11:26:20.726067 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerID="15375c58a53fe633e1759e20c9d16ad63468a36df47866fd2c275d06f38ec48e" exitCode=0 Nov 24 11:26:20 crc kubenswrapper[4752]: I1124 11:26:20.726123 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerDied","Data":"15375c58a53fe633e1759e20c9d16ad63468a36df47866fd2c275d06f38ec48e"} Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.818145 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933591 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933729 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933789 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933821 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933863 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.933951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvh9b\" (UniqueName: \"kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b\") pod \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\" (UID: \"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28\") " Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.934313 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs" (OuterVolumeSpecName: "logs") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.934696 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.938117 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts" (OuterVolumeSpecName: "scripts") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.938771 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.938851 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b" (OuterVolumeSpecName: "kube-api-access-tvh9b") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "kube-api-access-tvh9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.965057 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:21 crc kubenswrapper[4752]: I1124 11:26:21.998004 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data" (OuterVolumeSpecName: "config-data") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.002350 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" (UID: "9b4cf2fd-213b-4101-b13b-9a58eb9f3f28"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035466 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvh9b\" (UniqueName: \"kubernetes.io/projected/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-kube-api-access-tvh9b\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035500 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035513 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035525 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035554 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035567 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035579 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.035593 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.062757 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.137191 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.475236 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542772 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542818 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542845 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542882 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542905 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542945 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.542966 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cqrt\" (UniqueName: \"kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.543003 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts\") pod \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\" (UID: \"a512d7d1-a13c-4c87-a20b-81a2ae62655a\") " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.543480 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs" (OuterVolumeSpecName: "logs") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.543498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.585138 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts" (OuterVolumeSpecName: "scripts") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.585631 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.585925 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt" (OuterVolumeSpecName: "kube-api-access-2cqrt") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "kube-api-access-2cqrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.630612 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647178 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647219 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cqrt\" (UniqueName: \"kubernetes.io/projected/a512d7d1-a13c-4c87-a20b-81a2ae62655a-kube-api-access-2cqrt\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647232 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647242 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647251 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.647260 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a512d7d1-a13c-4c87-a20b-81a2ae62655a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.669873 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.676991 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.678023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data" (OuterVolumeSpecName: "config-data") pod "a512d7d1-a13c-4c87-a20b-81a2ae62655a" (UID: "a512d7d1-a13c-4c87-a20b-81a2ae62655a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.751624 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.751665 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a512d7d1-a13c-4c87-a20b-81a2ae62655a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.751676 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.761571 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" event={"ID":"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55","Type":"ContainerStarted","Data":"e0babdc37f4d23392fd7039a2c159f95eeb6c70c008d4d45704626d8cce25733"} Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.765995 4752 generic.go:334] "Generic (PLEG): container finished" podID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerID="8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d" exitCode=0 Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.766126 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerDied","Data":"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d"} Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.766161 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.766176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a512d7d1-a13c-4c87-a20b-81a2ae62655a","Type":"ContainerDied","Data":"eed6ef5813eb0a5321cdbdf048b830fc768f0ac22fc0f88d819d573bea155758"} Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.766198 4752 scope.go:117] "RemoveContainer" containerID="8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.771069 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b4cf2fd-213b-4101-b13b-9a58eb9f3f28","Type":"ContainerDied","Data":"df5c05c48db5b1b37626d2bc284eb52b40b045399ddc23068bdf8e6feafd0f26"} Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.771189 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.783864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerStarted","Data":"b84aa77bae09e302746e37bd16e42bcfe71623a1fa8cc3ea07d3ee5a99c1fb61"} Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.784099 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-central-agent" containerID="cri-o://abbdc6f2496d21d6ed2c61a9b2bf0510522fadd077c510dfa54caa53686b0689" gracePeriod=30 Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.784423 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.784538 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-notification-agent" containerID="cri-o://c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082" gracePeriod=30 Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.784693 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="sg-core" containerID="cri-o://87efcae88cedc15462a8fe69c88dbd7d70162685faf240bdfb47620ec03b9460" gracePeriod=30 Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.784912 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" podStartSLOduration=2.330470191 podStartE2EDuration="10.784898927s" podCreationTimestamp="2025-11-24 11:26:12 +0000 UTC" firstStartedPulling="2025-11-24 11:26:13.15657342 +0000 UTC m=+1179.141393709" lastFinishedPulling="2025-11-24 11:26:21.611002156 +0000 UTC m=+1187.595822445" observedRunningTime="2025-11-24 11:26:22.775556877 +0000 UTC m=+1188.760377176" watchObservedRunningTime="2025-11-24 11:26:22.784898927 +0000 UTC m=+1188.769719216" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.790213 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="proxy-httpd" containerID="cri-o://b84aa77bae09e302746e37bd16e42bcfe71623a1fa8cc3ea07d3ee5a99c1fb61" gracePeriod=30 Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.825641 4752 scope.go:117] "RemoveContainer" containerID="efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.831605 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.848701 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.859802 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.860210 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860226 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.860245 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860252 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.860264 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860270 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.860297 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860304 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860470 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860486 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860500 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-log" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.860510 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" containerName="glance-httpd" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.861461 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.868147 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.871461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.871700 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z5nd5" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.872042 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.872342 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.886880 4752 scope.go:117] "RemoveContainer" containerID="8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.886980 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.888491 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d\": container with ID starting with 8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d not found: ID does not exist" containerID="8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.888521 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d"} err="failed to get container status \"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d\": rpc error: code = NotFound desc = could not find container \"8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d\": container with ID starting with 8a5c9af4ba1963c78b62aa82f9df991156dceab59a337d81426c1979f5fe519d not found: ID does not exist" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.888542 4752 scope.go:117] "RemoveContainer" containerID="efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928" Nov 24 11:26:22 crc kubenswrapper[4752]: E1124 11:26:22.890525 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928\": container with ID starting with efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928 not found: ID does not exist" containerID="efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.890575 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928"} err="failed to get container status \"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928\": rpc error: code = NotFound desc = could not find container \"efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928\": container with ID starting with efd227698d64905aa18bab84b4b892d17c7c39b448d5682eaadcccf82dbda928 not found: ID does 
not exist" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.890595 4752 scope.go:117] "RemoveContainer" containerID="15375c58a53fe633e1759e20c9d16ad63468a36df47866fd2c275d06f38ec48e" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.894981 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.911069 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.911327 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.889754672 podStartE2EDuration="10.911316449s" podCreationTimestamp="2025-11-24 11:26:12 +0000 UTC" firstStartedPulling="2025-11-24 11:26:13.589522151 +0000 UTC m=+1179.574342440" lastFinishedPulling="2025-11-24 11:26:21.611083928 +0000 UTC m=+1187.595904217" observedRunningTime="2025-11-24 11:26:22.868165272 +0000 UTC m=+1188.852985561" watchObservedRunningTime="2025-11-24 11:26:22.911316449 +0000 UTC m=+1188.896136738" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.912516 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.915711 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.915968 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.921865 4752 scope.go:117] "RemoveContainer" containerID="e81d5a0b95235237f39a31dd9ae847f30d22805d65c467d4d9437a45460ca9d7" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.926324 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.955763 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.955873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.955932 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxw5j\" (UniqueName: \"kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956150 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956281 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956318 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956335 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956384 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956406 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956555 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhm5p\" (UniqueName: \"kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956595 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956635 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956709 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956792 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:22 crc kubenswrapper[4752]: I1124 11:26:22.956826 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058424 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058564 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058634 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058663 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhm5p\" (UniqueName: \"kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058728 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058773 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058801 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058821 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058850 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058875 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058904 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " 
pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058987 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw5j\" (UniqueName: \"kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.058989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.059476 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.059727 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.061650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.061686 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.061655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.067698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.068446 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 
Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.071022 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.072340 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.075969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.076542 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.076859 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.079668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhm5p\" (UniqueName: \"kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.082916 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw5j\" (UniqueName: \"kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.097681 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.116935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName:
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.195242 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.244223 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:26:23 crc kubenswrapper[4752]: E1124 11:26:23.573818 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd02d2c0_6c03_46cc_9bc6_de7f16b89acb.slice/crio-c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd02d2c0_6c03_46cc_9bc6_de7f16b89acb.slice/crio-conmon-c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.786012 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799808 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerID="b84aa77bae09e302746e37bd16e42bcfe71623a1fa8cc3ea07d3ee5a99c1fb61" exitCode=0 Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799841 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerID="87efcae88cedc15462a8fe69c88dbd7d70162685faf240bdfb47620ec03b9460" exitCode=2 Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799852 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerID="c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082" exitCode=0 Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerDied","Data":"b84aa77bae09e302746e37bd16e42bcfe71623a1fa8cc3ea07d3ee5a99c1fb61"} Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerDied","Data":"87efcae88cedc15462a8fe69c88dbd7d70162685faf240bdfb47620ec03b9460"} Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.799991 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerDied","Data":"c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082"} Nov 24 11:26:23 crc kubenswrapper[4752]: I1124 11:26:23.983002 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:26:23 crc kubenswrapper[4752]: W1124 11:26:23.985488 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fc61a1_6794_4ecf_a3dd_d79a88eda486.slice/crio-a7ae944d23e0ca690cf82ed28c91c09d4ee094baf62771943265553e01af552f WatchSource:0}: Error finding 
container a7ae944d23e0ca690cf82ed28c91c09d4ee094baf62771943265553e01af552f: Status 404 returned error can't find the container with id a7ae944d23e0ca690cf82ed28c91c09d4ee094baf62771943265553e01af552f Nov 24 11:26:24 crc kubenswrapper[4752]: I1124 11:26:24.752519 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4cf2fd-213b-4101-b13b-9a58eb9f3f28" path="/var/lib/kubelet/pods/9b4cf2fd-213b-4101-b13b-9a58eb9f3f28/volumes" Nov 24 11:26:24 crc kubenswrapper[4752]: I1124 11:26:24.753692 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a512d7d1-a13c-4c87-a20b-81a2ae62655a" path="/var/lib/kubelet/pods/a512d7d1-a13c-4c87-a20b-81a2ae62655a/volumes" Nov 24 11:26:24 crc kubenswrapper[4752]: I1124 11:26:24.826733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerStarted","Data":"a7ae944d23e0ca690cf82ed28c91c09d4ee094baf62771943265553e01af552f"} Nov 24 11:26:24 crc kubenswrapper[4752]: I1124 11:26:24.835257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerStarted","Data":"5608c635aa46e4432dd124ea11759d4280b1e7c2301d6131246b4a3f068d7cee"} Nov 24 11:26:24 crc kubenswrapper[4752]: I1124 11:26:24.835312 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerStarted","Data":"0aa09047cdbffb520d71d2f8b1165c33309be8bed98715681874484ee3fb7809"} Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.849569 4752 generic.go:334] "Generic (PLEG): container finished" podID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerID="abbdc6f2496d21d6ed2c61a9b2bf0510522fadd077c510dfa54caa53686b0689" exitCode=0 Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.849658 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerDied","Data":"abbdc6f2496d21d6ed2c61a9b2bf0510522fadd077c510dfa54caa53686b0689"} Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.854996 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerStarted","Data":"9f8c293fe3f68cdc99eb5f8a593f9b4e799568c50b7fc38020752eb9bf6b2982"} Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.856399 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerStarted","Data":"944ddefc80e86ba009045369f39188fcfd4ed96c67435d31b540b52189cc2961"} Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.856428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerStarted","Data":"4903a5653dc6c301907cc4b8a087af2718aabe36426414f0ba709fb8e91a2e8c"} Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.884321 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.884303916 podStartE2EDuration="3.884303916s" podCreationTimestamp="2025-11-24 11:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-24 11:26:25.87373021 +0000 UTC m=+1191.858550509" watchObservedRunningTime="2025-11-24 11:26:25.884303916 +0000 UTC m=+1191.869124205" Nov 24 11:26:25 crc kubenswrapper[4752]: I1124 11:26:25.898919 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.898884247 podStartE2EDuration="3.898884247s" podCreationTimestamp="2025-11-24 11:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:25.894390587 +0000 UTC m=+1191.879210876" watchObservedRunningTime="2025-11-24 11:26:25.898884247 +0000 UTC m=+1191.883704546" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.120370 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.218891 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.218937 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.218973 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.218991 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw5n8\" (UniqueName: \"kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219014 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219036 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219113 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219199 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd\") pod \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\" (UID: \"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb\") " Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219554 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.219803 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.220591 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.220643 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.224517 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8" (OuterVolumeSpecName: "kube-api-access-bw5n8") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "kube-api-access-bw5n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.226288 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts" (OuterVolumeSpecName: "scripts") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.267814 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.278940 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.322054 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.322099 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw5n8\" (UniqueName: \"kubernetes.io/projected/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-kube-api-access-bw5n8\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.322115 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.322126 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.324029 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.331848 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data" (OuterVolumeSpecName: "config-data") pod "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" (UID: "bd02d2c0-6c03-46cc-9bc6-de7f16b89acb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.423310 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.423350 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.871966 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02d2c0-6c03-46cc-9bc6-de7f16b89acb","Type":"ContainerDied","Data":"04020ccb08b6bf747bde4dc2940f04e4046a354c63a37476d8bfef5b3271c8d1"} Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.872336 4752 scope.go:117] "RemoveContainer" containerID="b84aa77bae09e302746e37bd16e42bcfe71623a1fa8cc3ea07d3ee5a99c1fb61" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.872061 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.904588 4752 scope.go:117] "RemoveContainer" containerID="87efcae88cedc15462a8fe69c88dbd7d70162685faf240bdfb47620ec03b9460" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.909566 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.919812 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.930374 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:26 crc kubenswrapper[4752]: E1124 11:26:26.930826 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-central-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.930845 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-central-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: E1124 11:26:26.930859 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="sg-core" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.930865 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="sg-core" Nov 24 11:26:26 crc kubenswrapper[4752]: E1124 11:26:26.930892 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-notification-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.930900 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-notification-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: E1124 11:26:26.930908 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="proxy-httpd" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.930913 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="proxy-httpd" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.931100 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="sg-core" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.931120 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-central-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.931132 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="ceilometer-notification-agent" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.931151 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" containerName="proxy-httpd" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.932718 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.943467 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.956342 4752 scope.go:117] "RemoveContainer" containerID="c7ce26739d729d68b575dc1602916d98c2b5685a62b5cea8ee37598153bc8082" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.957273 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.957526 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 11:26:26 crc kubenswrapper[4752]: I1124 11:26:26.964688 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.004732 4752 scope.go:117] "RemoveContainer" containerID="abbdc6f2496d21d6ed2c61a9b2bf0510522fadd077c510dfa54caa53686b0689" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034014 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034068 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s799\" (UniqueName: \"kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034306 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034354 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034466 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.034495 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.136151 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s799\" (UniqueName: \"kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.136440 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.136535 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.136620 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.137223 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.137341 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.137451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.137927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.138111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.138282 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.141084 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.141307 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.142001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.142701 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.152165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.155040 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s799\" (UniqueName: \"kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799\") pod \"ceilometer-0\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.310508 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.745052 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:27 crc kubenswrapper[4752]: I1124 11:26:27.883656 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerStarted","Data":"6aa926b2c6caab5212686ea00a27c9320fda1102fa5f8f0d385c964b59e449be"} Nov 24 11:26:28 crc kubenswrapper[4752]: I1124 11:26:28.742861 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd02d2c0-6c03-46cc-9bc6-de7f16b89acb" path="/var/lib/kubelet/pods/bd02d2c0-6c03-46cc-9bc6-de7f16b89acb/volumes" Nov 24 11:26:28 crc kubenswrapper[4752]: I1124 11:26:28.895457 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerStarted","Data":"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e"} Nov 24 11:26:29 crc kubenswrapper[4752]: I1124 11:26:29.905970 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerStarted","Data":"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162"} Nov 24 11:26:30 crc kubenswrapper[4752]: I1124 11:26:30.919576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerStarted","Data":"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76"} Nov 24 11:26:31 crc kubenswrapper[4752]: I1124 11:26:31.931699 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerStarted","Data":"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182"} Nov 24 11:26:31 crc kubenswrapper[4752]: I1124 11:26:31.932096 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 11:26:31 crc kubenswrapper[4752]: I1124 11:26:31.955037 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.591080519 podStartE2EDuration="5.955016822s" podCreationTimestamp="2025-11-24 11:26:26 +0000 UTC" firstStartedPulling="2025-11-24 11:26:27.760417426 +0000 UTC m=+1193.745237715" lastFinishedPulling="2025-11-24 11:26:31.124353729 +0000 UTC m=+1197.109174018" observedRunningTime="2025-11-24 11:26:31.952713655 +0000 UTC m=+1197.937533964" watchObservedRunningTime="2025-11-24 11:26:31.955016822 +0000 UTC m=+1197.939837111" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.196655 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.197411 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.228873 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.244782 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.244914 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.244939 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.292572 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.292885 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.950354 4752 generic.go:334] "Generic (PLEG): container finished" podID="2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" containerID="e0babdc37f4d23392fd7039a2c159f95eeb6c70c008d4d45704626d8cce25733" exitCode=0 Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.950562 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" event={"ID":"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55","Type":"ContainerDied","Data":"e0babdc37f4d23392fd7039a2c159f95eeb6c70c008d4d45704626d8cce25733"} Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.951475 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.951520 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.951531 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 11:26:33 crc kubenswrapper[4752]: I1124 11:26:33.951541 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.440887 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.441248 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-central-agent" containerID="cri-o://6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e" gracePeriod=30 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.441278 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="proxy-httpd" containerID="cri-o://cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182" gracePeriod=30 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.441331 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-notification-agent" containerID="cri-o://2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162" gracePeriod=30 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.441533 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="sg-core" containerID="cri-o://0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76" gracePeriod=30 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.965819 4752 
generic.go:334] "Generic (PLEG): container finished" podID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerID="cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182" exitCode=0 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.966146 4752 generic.go:334] "Generic (PLEG): container finished" podID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerID="0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76" exitCode=2 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.966157 4752 generic.go:334] "Generic (PLEG): container finished" podID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerID="2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162" exitCode=0 Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.966043 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerDied","Data":"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182"} Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.967256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerDied","Data":"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76"} Nov 24 11:26:34 crc kubenswrapper[4752]: I1124 11:26:34.967269 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerDied","Data":"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162"} Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.359942 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.419842 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data\") pod \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.419888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle\") pod \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.419934 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts\") pod \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.420093 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmccx\" (UniqueName: \"kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx\") pod \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\" (UID: \"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.431064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts" (OuterVolumeSpecName: "scripts") pod "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" (UID: "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.431931 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx" (OuterVolumeSpecName: "kube-api-access-qmccx") pod "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" (UID: "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55"). InnerVolumeSpecName "kube-api-access-qmccx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.434679 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.463016 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data" (OuterVolumeSpecName: "config-data") pod "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" (UID: "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.466137 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" (UID: "2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521504 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521560 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521614 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s799\" (UniqueName: \"kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521639 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521705 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521863 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs\") pod \"23515198-ab42-4f07-83b5-ac2321bd7fef\" (UID: \"23515198-ab42-4f07-83b5-ac2321bd7fef\") " Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.521935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522226 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522276 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522288 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522299 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522311 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.522324 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmccx\" (UniqueName: \"kubernetes.io/projected/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55-kube-api-access-qmccx\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.526155 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799" (OuterVolumeSpecName: "kube-api-access-5s799") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "kube-api-access-5s799". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.526199 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts" (OuterVolumeSpecName: "scripts") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.548067 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.568910 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.595464 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.613210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data" (OuterVolumeSpecName: "config-data") pod "23515198-ab42-4f07-83b5-ac2321bd7fef" (UID: "23515198-ab42-4f07-83b5-ac2321bd7fef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623811 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23515198-ab42-4f07-83b5-ac2321bd7fef-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623842 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623929 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623948 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623960 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623971 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s799\" (UniqueName: \"kubernetes.io/projected/23515198-ab42-4f07-83b5-ac2321bd7fef-kube-api-access-5s799\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.623982 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23515198-ab42-4f07-83b5-ac2321bd7fef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.795828 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.798314 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.975565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" event={"ID":"2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55","Type":"ContainerDied","Data":"fa63b3bed5e17fc9e7b3732fbd1d0d90ffbe7362bfd476db19f0fb8ddd91e110"} Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.975604 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa63b3bed5e17fc9e7b3732fbd1d0d90ffbe7362bfd476db19f0fb8ddd91e110" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.975598 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7qgtl" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978637 4752 generic.go:334] "Generic (PLEG): container finished" podID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerID="6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e" exitCode=0 Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978668 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerDied","Data":"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e"} Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978689 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978698 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978706 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23515198-ab42-4f07-83b5-ac2321bd7fef","Type":"ContainerDied","Data":"6aa926b2c6caab5212686ea00a27c9320fda1102fa5f8f0d385c964b59e449be"} Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.978735 4752 scope.go:117] "RemoveContainer" containerID="cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182" Nov 24 11:26:35 crc kubenswrapper[4752]: I1124 11:26:35.990874 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.001059 4752 scope.go:117] "RemoveContainer" containerID="0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.001269 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.044942 4752 scope.go:117] "RemoveContainer" containerID="2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.082346 4752 scope.go:117] "RemoveContainer" containerID="6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.082811 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.099892 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.144362 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.145277 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="sg-core" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145308 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="sg-core" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.145322 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-notification-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 
11:26:36.145329 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-notification-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.145355 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" containerName="nova-cell0-conductor-db-sync" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145363 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" containerName="nova-cell0-conductor-db-sync" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.145393 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-central-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145399 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-central-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.145407 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="proxy-httpd" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145412 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="proxy-httpd" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145569 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-notification-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145588 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="sg-core" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145602 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="ceilometer-central-agent" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145615 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" containerName="nova-cell0-conductor-db-sync" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.145622 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" containerName="proxy-httpd" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.152049 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.156257 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.156361 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.156433 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.157515 4752 scope.go:117] "RemoveContainer" containerID="cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.159188 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182\": container with ID starting with cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182 not found: ID does not exist" containerID="cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.159217 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182"} err="failed to get container status \"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182\": rpc error: code = NotFound desc = could not find container \"cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182\": container with ID starting with cb41f648eedd5e1e004b30e261b5fc56e95c7a25d1de15569e6c71eff1ebb182 not found: ID does not exist" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.159242 4752 scope.go:117] "RemoveContainer" containerID="0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.159395 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.159619 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76\": container with ID starting with 0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76 not found: ID does not exist" containerID="0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.159644 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76"} err="failed to get container status \"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76\": rpc error: code = NotFound desc = could not find container \"0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76\": container with ID starting with 0370f1bf635129133c09e671dc9c39c6538ed64a431041082d99f2411ca44c76 not found: ID does not exist" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.159663 4752 scope.go:117] "RemoveContainer" containerID="2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.160088 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162\": container with ID starting with 2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162 not found: ID does not exist" containerID="2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.160109 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162"} err="failed to get container status \"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162\": rpc error: code = NotFound desc = could not find container \"2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162\": container with ID starting with 2278eb5417743cc16e71303458116880db6ebd97702f8f3e08f1d1097346d162 not found: ID does not exist" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.160123 4752 scope.go:117] "RemoveContainer" containerID="6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e" Nov 24 11:26:36 crc kubenswrapper[4752]: E1124 11:26:36.160315 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e\": container with ID starting with 6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e not found: ID does not exist" containerID="6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.160332 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e"} err="failed to get container status \"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e\": rpc error: code = NotFound desc = could not find container \"6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e\": container with ID starting with 6aadc62293c4245d30845a5fd0dddc3432dedc55e14a0e2ee304da550d0eea5e not found: ID does not exist" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.171486 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.174967 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.178528 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q9v5z" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.178790 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.179230 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239096 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239190 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239332 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239371 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8sgm\" (UniqueName: \"kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239564 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239651 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239719 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9f7l\" (UniqueName: \"kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239817 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239857 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.239958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341475 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341583 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341621 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341657 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341672 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8sgm\" (UniqueName: \"kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341727 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341780 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9f7l\" (UniqueName: \"kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341819 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.341842 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.342002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.342951 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.346771 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.346955 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.347576 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.348292 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.348404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.355333 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.356399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.359263 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8sgm\" (UniqueName: \"kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm\") pod \"nova-cell0-conductor-0\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.364868 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9f7l\" (UniqueName: \"kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l\") pod \"ceilometer-0\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.478251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.491436 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.738260 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23515198-ab42-4f07-83b5-ac2321bd7fef" path="/var/lib/kubelet/pods/23515198-ab42-4f07-83b5-ac2321bd7fef/volumes" Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.957054 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:26:36 crc kubenswrapper[4752]: I1124 11:26:36.990253 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerStarted","Data":"56f2528d0827a7ddfeb81df3732f32e7a0a596da619da02f5709ae432bbcf951"} Nov 24 11:26:37 crc kubenswrapper[4752]: I1124 11:26:37.055994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 11:26:38 crc kubenswrapper[4752]: I1124 11:26:38.000144 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerStarted","Data":"27b6d6942722d1db58c66719a8f8491f311077b193e1ef8b8f1f02443418972e"} Nov 24 11:26:38 crc kubenswrapper[4752]: I1124 11:26:38.005371 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d92c8e6-2992-41be-950b-1eba6c84b636","Type":"ContainerStarted","Data":"f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279"} Nov 24 11:26:38 crc kubenswrapper[4752]: I1124 11:26:38.005409 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d92c8e6-2992-41be-950b-1eba6c84b636","Type":"ContainerStarted","Data":"0a8d7b4a78af82bdec9ede8779357db9c95dd08572eb99903e7abda6ae72ef5e"} Nov 24 11:26:38 crc kubenswrapper[4752]: I1124 11:26:38.005452 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:38 crc kubenswrapper[4752]: I1124 11:26:38.035391 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.035375307 podStartE2EDuration="2.035375307s" podCreationTimestamp="2025-11-24 11:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:38.023916206 +0000 UTC m=+1204.008736495" watchObservedRunningTime="2025-11-24 11:26:38.035375307 +0000 UTC m=+1204.020195596" Nov 24 11:26:39 crc kubenswrapper[4752]: I1124 11:26:39.012666 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerStarted","Data":"606ef6738daabcd00a974684ff257695a7daec78240e6cfd544c3ac99056cf86"} Nov 24 11:26:40 crc kubenswrapper[4752]: I1124 11:26:40.024022 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerStarted","Data":"60be8840e1ee6b3f596a5ce76b04f231bd7b6191f9dc15ca988c87b08abbc4ad"} Nov 24 11:26:41 crc kubenswrapper[4752]: I1124 11:26:41.035869 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerStarted","Data":"4618b4ee6d11ee424000b275551df92bc5b3fa2d22b3cea12f04bb0291dd372d"} Nov 24 11:26:41 crc kubenswrapper[4752]: I1124 11:26:41.036294 4752 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 11:26:41 crc kubenswrapper[4752]: I1124 11:26:41.068284 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4336641700000001 podStartE2EDuration="5.068254814s" podCreationTimestamp="2025-11-24 11:26:36 +0000 UTC" firstStartedPulling="2025-11-24 11:26:36.961079345 +0000 UTC m=+1202.945899634" lastFinishedPulling="2025-11-24 11:26:40.595669989 +0000 UTC m=+1206.580490278" observedRunningTime="2025-11-24 11:26:41.059331236 +0000 UTC m=+1207.044151535" watchObservedRunningTime="2025-11-24 11:26:41.068254814 +0000 UTC m=+1207.053075113" Nov 24 11:26:46 crc kubenswrapper[4752]: I1124 11:26:46.537919 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.050666 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-27bqg"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.054060 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.062093 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-27bqg"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.091030 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.091296 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.165865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p5q\" (UniqueName: \"kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.165919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.165951 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.165979 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.271773 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99p5q\" (UniqueName: 
\"kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.271815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.271846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.271868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.284579 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.285200 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.285263 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.287783 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.304245 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.317334 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p5q\" (UniqueName: \"kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q\") pod \"nova-cell0-cell-mapping-27bqg\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.317997 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.325890 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.355591 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.367626 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.374502 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.374581 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.374606 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.374637 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7k9\" (UniqueName: \"kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.377602 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.399621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.432282 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.472534 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.474299 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.484492 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488230 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488395 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbxf\" (UniqueName: \"kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488493 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488518 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.488552 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7k9\" (UniqueName: \"kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.490193 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.517577 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.523420 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.555321 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7k9\" (UniqueName: \"kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9\") pod \"nova-api-0\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.593949 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.593986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.594007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.594057 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbxf\" (UniqueName: \"kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.594081 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvsn\" (UniqueName: \"kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.594114 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.594128 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.606358 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.613964 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.616349 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.634793 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.636188 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.646786 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.652406 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.677573 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbxf\" (UniqueName: \"kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf\") pod \"nova-scheduler-0\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.709293 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.710051 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711237 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8x7\" (UniqueName: \"kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711276 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711304 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711339 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711374 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711416 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvsn\" (UniqueName: \"kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.711493 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.712012 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.713208 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.717769 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.727969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.747994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.756731 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvsn\" (UniqueName: \"kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn\") pod \"nova-metadata-0\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " pod="openstack/nova-metadata-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.812931 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813056 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813088 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8x7\" (UniqueName: \"kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813134 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813177 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813231 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813259 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68trx\" (UniqueName: \"kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.813780 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.822281 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.822908 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.837335 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8x7\" (UniqueName: \"kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7\") pod \"nova-cell1-novncproxy-0\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915004 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915264 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68trx\" (UniqueName: 
\"kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915420 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.915487 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.917020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.917525 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.917699 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.918122 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.918641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:47 crc kubenswrapper[4752]: I1124 11:26:47.935848 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68trx\" (UniqueName: \"kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx\") pod \"dnsmasq-dns-bccf8f775-2258v\" (UID: 
\"917b2118-45fd-437a-8f50-7b5f30336603\") " pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.043198 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.068097 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.097654 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.445922 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-27bqg"] Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.572118 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.581597 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:26:48 crc kubenswrapper[4752]: W1124 11:26:48.594545 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bc6018_1c58_492d_bc91_5f55c7cf9b18.slice/crio-0e2d8bfefaa7728d69073898bf3b70c361e281ad5ce5b14fca7a2d27519841c4 WatchSource:0}: Error finding container 0e2d8bfefaa7728d69073898bf3b70c361e281ad5ce5b14fca7a2d27519841c4: Status 404 returned error can't find the container with id 0e2d8bfefaa7728d69073898bf3b70c361e281ad5ce5b14fca7a2d27519841c4 Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.728121 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wm6ct"] Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.733239 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.736668 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.736913 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 24 11:26:48 crc kubenswrapper[4752]: W1124 11:26:48.752816 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd0f817_bfdb_4147_ac72_b0938d18b0bf.slice/crio-83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66 WatchSource:0}: Error finding container 83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66: Status 404 returned error can't find the container with id 83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66 Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.755521 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wm6ct"] Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.755551 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:26:48 crc kubenswrapper[4752]: W1124 11:26:48.757581 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c740ec7_cead_412f_b1ff_ae03d6e07007.slice/crio-e9dedc4417a37101312df4bd6b29fbead8d2e4f1879ccf566d687cfb4099c1f3 WatchSource:0}: Error finding container e9dedc4417a37101312df4bd6b29fbead8d2e4f1879ccf566d687cfb4099c1f3: Status 404 returned error can't find the container with id e9dedc4417a37101312df4bd6b29fbead8d2e4f1879ccf566d687cfb4099c1f3 Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.761663 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:48 crc kubenswrapper[4752]: W1124 11:26:48.775510 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod917b2118_45fd_437a_8f50_7b5f30336603.slice/crio-a326f075bbba45680d191aba27ffcef9a5204e9927aa6957ad0427f124e2b53d WatchSource:0}: Error finding container a326f075bbba45680d191aba27ffcef9a5204e9927aa6957ad0427f124e2b53d: Status 404 returned error can't find the container with id a326f075bbba45680d191aba27ffcef9a5204e9927aa6957ad0427f124e2b53d Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.779232 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.841865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.841929 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc 
kubenswrapper[4752]: I1124 11:26:48.841972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.842017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74sd\" (UniqueName: \"kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.943815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.943867 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.943903 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.943937 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v74sd\" (UniqueName: \"kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.949355 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.949419 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.951493 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " 
pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:48 crc kubenswrapper[4752]: I1124 11:26:48.960544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74sd\" (UniqueName: \"kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd\") pod \"nova-cell1-conductor-db-sync-wm6ct\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.063703 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.154345 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d7bc6018-1c58-492d-bc91-5f55c7cf9b18","Type":"ContainerStarted","Data":"0e2d8bfefaa7728d69073898bf3b70c361e281ad5ce5b14fca7a2d27519841c4"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.157096 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerStarted","Data":"54ef48ed3de085be852c7f72040a4b49a11db5c62345f3cd8fc3cef11cc8c51a"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.163993 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8fd0f817-bfdb-4147-ac72-b0938d18b0bf","Type":"ContainerStarted","Data":"83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.177439 4752 generic.go:334] "Generic (PLEG): container finished" podID="917b2118-45fd-437a-8f50-7b5f30336603" containerID="8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9" exitCode=0 Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.177549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-2258v" event={"ID":"917b2118-45fd-437a-8f50-7b5f30336603","Type":"ContainerDied","Data":"8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.177580 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-2258v" event={"ID":"917b2118-45fd-437a-8f50-7b5f30336603","Type":"ContainerStarted","Data":"a326f075bbba45680d191aba27ffcef9a5204e9927aa6957ad0427f124e2b53d"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.181264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerStarted","Data":"e9dedc4417a37101312df4bd6b29fbead8d2e4f1879ccf566d687cfb4099c1f3"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.183112 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-27bqg" event={"ID":"80204aea-8ec7-4653-bfad-d2e6800ecf6e","Type":"ContainerStarted","Data":"dfa9458977016460868074db72d499cd7233b5ead8ca22d285fc42aa9c5b96af"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.183142 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-27bqg" event={"ID":"80204aea-8ec7-4653-bfad-d2e6800ecf6e","Type":"ContainerStarted","Data":"b925283fc406d5d9445a02c8aed9e5da72c015de3f4aeac1ca5e27c1f369ff2a"} Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.230363 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-27bqg" 
podStartSLOduration=2.230345153 podStartE2EDuration="2.230345153s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:49.21502812 +0000 UTC m=+1215.199848409" watchObservedRunningTime="2025-11-24 11:26:49.230345153 +0000 UTC m=+1215.215165442" Nov 24 11:26:49 crc kubenswrapper[4752]: I1124 11:26:49.578074 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wm6ct"] Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.195188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-2258v" event={"ID":"917b2118-45fd-437a-8f50-7b5f30336603","Type":"ContainerStarted","Data":"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99"} Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.196049 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.198396 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" event={"ID":"3d48e84d-ad34-4d80-b0f4-0f332b45b21d","Type":"ContainerStarted","Data":"8874e4f51742808fc31228902a4ae1a8f430f19b1a2fd93d62169bec6c779b21"} Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.198423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" event={"ID":"3d48e84d-ad34-4d80-b0f4-0f332b45b21d","Type":"ContainerStarted","Data":"38107809df1cfa9fa53123eec6806e03c185746449cba41b031c6ab245fcdbaa"} Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.231986 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-2258v" podStartSLOduration=3.231958275 podStartE2EDuration="3.231958275s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:50.21968632 +0000 UTC m=+1216.204506609" watchObservedRunningTime="2025-11-24 11:26:50.231958275 +0000 UTC m=+1216.216778564" Nov 24 11:26:50 crc kubenswrapper[4752]: I1124 11:26:50.255640 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" podStartSLOduration=2.255621119 podStartE2EDuration="2.255621119s" podCreationTimestamp="2025-11-24 11:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:50.235768795 +0000 UTC m=+1216.220589084" watchObservedRunningTime="2025-11-24 11:26:50.255621119 +0000 UTC m=+1216.240441408" Nov 24 11:26:51 crc kubenswrapper[4752]: I1124 11:26:51.073486 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:26:51 crc kubenswrapper[4752]: I1124 11:26:51.079990 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.216611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerStarted","Data":"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae"} Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.217904 4752 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"d7bc6018-1c58-492d-bc91-5f55c7cf9b18","Type":"ContainerStarted","Data":"288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476"} Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.222520 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerStarted","Data":"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62"} Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.224296 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8fd0f817-bfdb-4147-ac72-b0938d18b0bf","Type":"ContainerStarted","Data":"682dc8577a8c51a3a589c8dcc6befead348a66e7150c6f7f392a3e1290c56b3a"} Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.224456 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://682dc8577a8c51a3a589c8dcc6befead348a66e7150c6f7f392a3e1290c56b3a" gracePeriod=30 Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.235980 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.009540742 podStartE2EDuration="5.235960841s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="2025-11-24 11:26:48.59696085 +0000 UTC m=+1214.581781139" lastFinishedPulling="2025-11-24 11:26:51.823380949 +0000 UTC m=+1217.808201238" observedRunningTime="2025-11-24 11:26:52.235836518 +0000 UTC m=+1218.220656817" watchObservedRunningTime="2025-11-24 11:26:52.235960841 +0000 UTC m=+1218.220781130" Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.262496 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.199673336 podStartE2EDuration="5.262480518s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="2025-11-24 11:26:48.759165967 +0000 UTC m=+1214.743986256" lastFinishedPulling="2025-11-24 11:26:51.821973149 +0000 UTC m=+1217.806793438" observedRunningTime="2025-11-24 11:26:52.252391206 +0000 UTC m=+1218.237211495" watchObservedRunningTime="2025-11-24 11:26:52.262480518 +0000 UTC m=+1218.247300807" Nov 24 11:26:52 crc kubenswrapper[4752]: I1124 11:26:52.710902 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.069439 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.237978 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerStarted","Data":"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339"} Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.243160 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-log" containerID="cri-o://1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" gracePeriod=30 Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.243288 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-metadata" containerID="cri-o://e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" gracePeriod=30 Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.243517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerStarted","Data":"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842"} Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.276310 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.037143935 podStartE2EDuration="6.276280902s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="2025-11-24 11:26:48.584222852 +0000 UTC m=+1214.569043141" lastFinishedPulling="2025-11-24 11:26:51.823359819 +0000 UTC m=+1217.808180108" observedRunningTime="2025-11-24 11:26:53.264471141 +0000 UTC m=+1219.249291440" watchObservedRunningTime="2025-11-24 11:26:53.276280902 +0000 UTC m=+1219.261101191" Nov 24 11:26:53 crc kubenswrapper[4752]: I1124 11:26:53.305883 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.231576424 podStartE2EDuration="6.305856647s" podCreationTimestamp="2025-11-24 11:26:47 +0000 UTC" firstStartedPulling="2025-11-24 11:26:48.760723072 +0000 UTC m=+1214.745543361" lastFinishedPulling="2025-11-24 11:26:51.835003285 +0000 UTC m=+1217.819823584" observedRunningTime="2025-11-24 11:26:53.29455011 +0000 UTC m=+1219.279370419" watchObservedRunningTime="2025-11-24 11:26:53.305856647 +0000 UTC m=+1219.290676946" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.054522 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.148400 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jvsn\" (UniqueName: \"kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn\") pod \"3c740ec7-cead-412f-b1ff-ae03d6e07007\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.148565 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs\") pod \"3c740ec7-cead-412f-b1ff-ae03d6e07007\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.148871 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs" (OuterVolumeSpecName: "logs") pod "3c740ec7-cead-412f-b1ff-ae03d6e07007" (UID: "3c740ec7-cead-412f-b1ff-ae03d6e07007"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.148969 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle\") pod \"3c740ec7-cead-412f-b1ff-ae03d6e07007\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.149043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data\") pod \"3c740ec7-cead-412f-b1ff-ae03d6e07007\" (UID: \"3c740ec7-cead-412f-b1ff-ae03d6e07007\") " Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.153046 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c740ec7-cead-412f-b1ff-ae03d6e07007-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.154447 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn" (OuterVolumeSpecName: "kube-api-access-5jvsn") pod "3c740ec7-cead-412f-b1ff-ae03d6e07007" (UID: "3c740ec7-cead-412f-b1ff-ae03d6e07007"). InnerVolumeSpecName "kube-api-access-5jvsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.185179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data" (OuterVolumeSpecName: "config-data") pod "3c740ec7-cead-412f-b1ff-ae03d6e07007" (UID: "3c740ec7-cead-412f-b1ff-ae03d6e07007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.197973 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c740ec7-cead-412f-b1ff-ae03d6e07007" (UID: "3c740ec7-cead-412f-b1ff-ae03d6e07007"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.255982 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jvsn\" (UniqueName: \"kubernetes.io/projected/3c740ec7-cead-412f-b1ff-ae03d6e07007-kube-api-access-5jvsn\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.256020 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.256033 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c740ec7-cead-412f-b1ff-ae03d6e07007-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262289 4752 generic.go:334] "Generic (PLEG): container finished" podID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerID="e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" exitCode=0 Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262319 4752 generic.go:334] "Generic (PLEG): container finished" podID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerID="1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" exitCode=143 Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerDied","Data":"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842"} Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262349 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262379 4752 scope.go:117] "RemoveContainer" containerID="e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerDied","Data":"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae"} Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.262534 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c740ec7-cead-412f-b1ff-ae03d6e07007","Type":"ContainerDied","Data":"e9dedc4417a37101312df4bd6b29fbead8d2e4f1879ccf566d687cfb4099c1f3"} Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.308840 4752 scope.go:117] "RemoveContainer" containerID="1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.329076 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.344679 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.346683 4752 scope.go:117] "RemoveContainer" containerID="e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" Nov 24 11:26:54 crc kubenswrapper[4752]: E1124 11:26:54.349352 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842\": container with ID starting with e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842 not found: ID does not exist" containerID="e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.349408 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842"} err="failed to get container status \"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842\": rpc error: code = NotFound desc = could not find container \"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842\": container with ID starting with e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842 not found: ID does not exist" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.349442 4752 scope.go:117] "RemoveContainer" containerID="1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" Nov 24 11:26:54 crc kubenswrapper[4752]: E1124 11:26:54.349785 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae\": container with ID starting with 1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae not found: ID does not exist" containerID="1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.349817 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae"} err="failed to get container status \"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae\": rpc error: code = 
NotFound desc = could not find container \"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae\": container with ID starting with 1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae not found: ID does not exist" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.349835 4752 scope.go:117] "RemoveContainer" containerID="e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.350086 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842"} err="failed to get container status \"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842\": rpc error: code = NotFound desc = could not find container \"e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842\": container with ID starting with e085e02523335c61d605d998a71058ad63d4c9dc0650070d708479ab032e6842 not found: ID does not exist" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.350103 4752 scope.go:117] "RemoveContainer" containerID="1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.350333 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae"} err="failed to get container status \"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae\": rpc error: code = NotFound desc = could not find container \"1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae\": container with ID starting with 1020e1afaf7c634f3869fe71601cd653f1db1a7daf04ed6e05a1f1b10cab6fae not found: ID does not exist" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.359376 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:54 crc kubenswrapper[4752]: E1124 11:26:54.359838 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-metadata" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.359855 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-metadata" Nov 24 11:26:54 crc kubenswrapper[4752]: E1124 11:26:54.359873 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-log" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.359880 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-log" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.360106 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-metadata" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.360135 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" containerName="nova-metadata-log" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.361261 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.363783 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.364403 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.372189 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.460208 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.460815 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.460897 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.460957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.461037 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmx6\" (UniqueName: \"kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.563458 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.563548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.563610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 
11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.563634 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.563688 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmx6\" (UniqueName: \"kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.565197 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.568289 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.569030 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.577572 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.578662 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmx6\" (UniqueName: \"kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6\") pod \"nova-metadata-0\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.693680 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:54 crc kubenswrapper[4752]: I1124 11:26:54.758622 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c740ec7-cead-412f-b1ff-ae03d6e07007" path="/var/lib/kubelet/pods/3c740ec7-cead-412f-b1ff-ae03d6e07007/volumes" Nov 24 11:26:55 crc kubenswrapper[4752]: W1124 11:26:55.189179 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39fe6b62_7636_4a34_9d7b_965a96563930.slice/crio-f4c4b2f2aa9168b5167256e88c0e432e90d7c5030a98458fbed20ab5b1f0235d WatchSource:0}: Error finding container f4c4b2f2aa9168b5167256e88c0e432e90d7c5030a98458fbed20ab5b1f0235d: Status 404 returned error can't find the container with id f4c4b2f2aa9168b5167256e88c0e432e90d7c5030a98458fbed20ab5b1f0235d Nov 24 11:26:55 crc kubenswrapper[4752]: I1124 11:26:55.191530 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:55 crc kubenswrapper[4752]: I1124 11:26:55.275535 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerStarted","Data":"f4c4b2f2aa9168b5167256e88c0e432e90d7c5030a98458fbed20ab5b1f0235d"} Nov 24 11:26:56 crc kubenswrapper[4752]: I1124 11:26:56.295365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerStarted","Data":"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808"} Nov 24 11:26:56 crc kubenswrapper[4752]: I1124 11:26:56.296240 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerStarted","Data":"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86"} Nov 24 11:26:56 crc kubenswrapper[4752]: I1124 11:26:56.301781 4752 generic.go:334] "Generic (PLEG): container finished" podID="80204aea-8ec7-4653-bfad-d2e6800ecf6e" containerID="dfa9458977016460868074db72d499cd7233b5ead8ca22d285fc42aa9c5b96af" exitCode=0 Nov 24 11:26:56 crc kubenswrapper[4752]: I1124 11:26:56.301853 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-27bqg" event={"ID":"80204aea-8ec7-4653-bfad-d2e6800ecf6e","Type":"ContainerDied","Data":"dfa9458977016460868074db72d499cd7233b5ead8ca22d285fc42aa9c5b96af"} Nov 24 11:26:56 crc kubenswrapper[4752]: I1124 11:26:56.338801 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.338729293 podStartE2EDuration="2.338729293s" podCreationTimestamp="2025-11-24 11:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:26:56.323373999 +0000 UTC m=+1222.308194328" watchObservedRunningTime="2025-11-24 11:26:56.338729293 +0000 UTC m=+1222.323549612" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.309963 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d48e84d-ad34-4d80-b0f4-0f332b45b21d" containerID="8874e4f51742808fc31228902a4ae1a8f430f19b1a2fd93d62169bec6c779b21" exitCode=0 Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.310075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" 
event={"ID":"3d48e84d-ad34-4d80-b0f4-0f332b45b21d","Type":"ContainerDied","Data":"8874e4f51742808fc31228902a4ae1a8f430f19b1a2fd93d62169bec6c779b21"} Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.710770 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.710820 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.711876 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.716733 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.771047 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.845067 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data\") pod \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.845481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99p5q\" (UniqueName: \"kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q\") pod \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.845540 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts\") pod \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.845575 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle\") pod \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\" (UID: \"80204aea-8ec7-4653-bfad-d2e6800ecf6e\") " Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.852799 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts" (OuterVolumeSpecName: "scripts") pod "80204aea-8ec7-4653-bfad-d2e6800ecf6e" (UID: "80204aea-8ec7-4653-bfad-d2e6800ecf6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.852999 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q" (OuterVolumeSpecName: "kube-api-access-99p5q") pod "80204aea-8ec7-4653-bfad-d2e6800ecf6e" (UID: "80204aea-8ec7-4653-bfad-d2e6800ecf6e"). InnerVolumeSpecName "kube-api-access-99p5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.879359 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80204aea-8ec7-4653-bfad-d2e6800ecf6e" (UID: "80204aea-8ec7-4653-bfad-d2e6800ecf6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.884400 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data" (OuterVolumeSpecName: "config-data") pod "80204aea-8ec7-4653-bfad-d2e6800ecf6e" (UID: "80204aea-8ec7-4653-bfad-d2e6800ecf6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.948061 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.948110 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99p5q\" (UniqueName: \"kubernetes.io/projected/80204aea-8ec7-4653-bfad-d2e6800ecf6e-kube-api-access-99p5q\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.948126 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:57 crc kubenswrapper[4752]: I1124 11:26:57.948139 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80204aea-8ec7-4653-bfad-d2e6800ecf6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.100865 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.191710 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.194038 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="dnsmasq-dns" containerID="cri-o://bac1392813903ed8becd43cc8d47a7e24faaeb6360ec55c28bab36488f31e191" gracePeriod=10 Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.321364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-27bqg" event={"ID":"80204aea-8ec7-4653-bfad-d2e6800ecf6e","Type":"ContainerDied","Data":"b925283fc406d5d9445a02c8aed9e5da72c015de3f4aeac1ca5e27c1f369ff2a"} Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.322151 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b925283fc406d5d9445a02c8aed9e5da72c015de3f4aeac1ca5e27c1f369ff2a" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.322278 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-27bqg" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.341776 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerID="bac1392813903ed8becd43cc8d47a7e24faaeb6360ec55c28bab36488f31e191" exitCode=0 Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.343027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" event={"ID":"3e1d72bc-5f15-4bf5-a424-62f34eafd84e","Type":"ContainerDied","Data":"bac1392813903ed8becd43cc8d47a7e24faaeb6360ec55c28bab36488f31e191"} Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.402348 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.479678 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.503360 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.503551 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-log" containerID="cri-o://323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" gracePeriod=30 Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.504009 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-metadata" containerID="cri-o://9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" gracePeriod=30 Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.793182 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.793184 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.857595 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.869003 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971287 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle\") pod \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971585 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7wn\" (UniqueName: \"kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971631 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971685 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data\") pod \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971774 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts\") pod \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\" (UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971790 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971819 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971832 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971877 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config\") pod \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\" (UID: \"3e1d72bc-5f15-4bf5-a424-62f34eafd84e\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.971899 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v74sd\" (UniqueName: \"kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd\") pod \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\" 
(UID: \"3d48e84d-ad34-4d80-b0f4-0f332b45b21d\") " Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.991949 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts" (OuterVolumeSpecName: "scripts") pod "3d48e84d-ad34-4d80-b0f4-0f332b45b21d" (UID: "3d48e84d-ad34-4d80-b0f4-0f332b45b21d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.993315 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn" (OuterVolumeSpecName: "kube-api-access-cb7wn") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "kube-api-access-cb7wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:58 crc kubenswrapper[4752]: I1124 11:26:58.993724 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd" (OuterVolumeSpecName: "kube-api-access-v74sd") pod "3d48e84d-ad34-4d80-b0f4-0f332b45b21d" (UID: "3d48e84d-ad34-4d80-b0f4-0f332b45b21d"). InnerVolumeSpecName "kube-api-access-v74sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.074276 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7wn\" (UniqueName: \"kubernetes.io/projected/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-kube-api-access-cb7wn\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.074304 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.074314 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v74sd\" (UniqueName: \"kubernetes.io/projected/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-kube-api-access-v74sd\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.112653 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data" (OuterVolumeSpecName: "config-data") pod "3d48e84d-ad34-4d80-b0f4-0f332b45b21d" (UID: "3d48e84d-ad34-4d80-b0f4-0f332b45b21d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.134878 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d48e84d-ad34-4d80-b0f4-0f332b45b21d" (UID: "3d48e84d-ad34-4d80-b0f4-0f332b45b21d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.139050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.165616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.166345 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.177763 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.177787 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e84d-ad34-4d80-b0f4-0f332b45b21d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.177796 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.177806 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.194121 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.212418 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.246012 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config" (OuterVolumeSpecName: "config") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.247195 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e1d72bc-5f15-4bf5-a424-62f34eafd84e" (UID: "3e1d72bc-5f15-4bf5-a424-62f34eafd84e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.279402 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs\") pod \"39fe6b62-7636-4a34-9d7b-965a96563930\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.279458 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle\") pod \"39fe6b62-7636-4a34-9d7b-965a96563930\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.279596 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data\") pod \"39fe6b62-7636-4a34-9d7b-965a96563930\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.279615 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmx6\" (UniqueName: \"kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6\") pod \"39fe6b62-7636-4a34-9d7b-965a96563930\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.279674 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs\") pod \"39fe6b62-7636-4a34-9d7b-965a96563930\" (UID: \"39fe6b62-7636-4a34-9d7b-965a96563930\") " Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.280046 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.280064 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.280077 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e1d72bc-5f15-4bf5-a424-62f34eafd84e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.280261 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs" (OuterVolumeSpecName: "logs") pod "39fe6b62-7636-4a34-9d7b-965a96563930" (UID: "39fe6b62-7636-4a34-9d7b-965a96563930"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.286860 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6" (OuterVolumeSpecName: "kube-api-access-cqmx6") pod "39fe6b62-7636-4a34-9d7b-965a96563930" (UID: "39fe6b62-7636-4a34-9d7b-965a96563930"). InnerVolumeSpecName "kube-api-access-cqmx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.305391 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39fe6b62-7636-4a34-9d7b-965a96563930" (UID: "39fe6b62-7636-4a34-9d7b-965a96563930"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.316846 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data" (OuterVolumeSpecName: "config-data") pod "39fe6b62-7636-4a34-9d7b-965a96563930" (UID: "39fe6b62-7636-4a34-9d7b-965a96563930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.331912 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "39fe6b62-7636-4a34-9d7b-965a96563930" (UID: "39fe6b62-7636-4a34-9d7b-965a96563930"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.358932 4752 generic.go:334] "Generic (PLEG): container finished" podID="39fe6b62-7636-4a34-9d7b-965a96563930" containerID="9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" exitCode=0 Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.358966 4752 generic.go:334] "Generic (PLEG): container finished" podID="39fe6b62-7636-4a34-9d7b-965a96563930" containerID="323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" exitCode=143 Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.359026 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerDied","Data":"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808"} Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.359027 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.359093 4752 scope.go:117] "RemoveContainer" containerID="9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.359057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerDied","Data":"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86"} Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.359183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39fe6b62-7636-4a34-9d7b-965a96563930","Type":"ContainerDied","Data":"f4c4b2f2aa9168b5167256e88c0e432e90d7c5030a98458fbed20ab5b1f0235d"} Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.364536 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" event={"ID":"3e1d72bc-5f15-4bf5-a424-62f34eafd84e","Type":"ContainerDied","Data":"f1b43646d153d845681e012ce48db1e7b652caac948d8f1081c47b3f5d67cc4c"} Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.364625 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-q2svj" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.368837 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.370381 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wm6ct" event={"ID":"3d48e84d-ad34-4d80-b0f4-0f332b45b21d","Type":"ContainerDied","Data":"38107809df1cfa9fa53123eec6806e03c185746449cba41b031c6ab245fcdbaa"} Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.370419 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38107809df1cfa9fa53123eec6806e03c185746449cba41b031c6ab245fcdbaa" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.370566 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-log" containerID="cri-o://982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62" gracePeriod=30 Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.371029 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-api" containerID="cri-o://487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339" gracePeriod=30 Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.382692 4752 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.384203 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.384225 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fe6b62-7636-4a34-9d7b-965a96563930-config-data\") on node \"crc\" 
DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.384237 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqmx6\" (UniqueName: \"kubernetes.io/projected/39fe6b62-7636-4a34-9d7b-965a96563930-kube-api-access-cqmx6\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.384250 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39fe6b62-7636-4a34-9d7b-965a96563930-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401157 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401520 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80204aea-8ec7-4653-bfad-d2e6800ecf6e" containerName="nova-manage" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401535 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="80204aea-8ec7-4653-bfad-d2e6800ecf6e" containerName="nova-manage" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401550 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d48e84d-ad34-4d80-b0f4-0f332b45b21d" containerName="nova-cell1-conductor-db-sync" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401556 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d48e84d-ad34-4d80-b0f4-0f332b45b21d" containerName="nova-cell1-conductor-db-sync" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401567 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-log" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401575 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-log" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401593 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="dnsmasq-dns" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401598 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="dnsmasq-dns" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401613 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="init" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401619 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="init" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.401638 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-metadata" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401644 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-metadata" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401815 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="80204aea-8ec7-4653-bfad-d2e6800ecf6e" containerName="nova-manage" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401833 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-metadata" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 
11:26:59.401848 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" containerName="dnsmasq-dns" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401857 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" containerName="nova-metadata-log" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.401865 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d48e84d-ad34-4d80-b0f4-0f332b45b21d" containerName="nova-cell1-conductor-db-sync" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.402500 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.409188 4752 scope.go:117] "RemoveContainer" containerID="323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.410882 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.416632 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.425075 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.439582 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-q2svj"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.456433 4752 scope.go:117] "RemoveContainer" containerID="9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.457090 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808\": container with ID starting with 9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808 not found: ID does not exist" containerID="9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.457172 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808"} err="failed to get container status \"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808\": rpc error: code = NotFound desc = could not find container \"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808\": container with ID starting with 9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808 not found: ID does not exist" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.457229 4752 scope.go:117] "RemoveContainer" containerID="323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" Nov 24 11:26:59 crc kubenswrapper[4752]: E1124 11:26:59.457653 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86\": container with ID starting with 323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86 not found: ID does not exist" containerID="323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.457686 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86"} err="failed to get container status \"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86\": rpc error: code = NotFound desc = could not find container \"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86\": container with ID starting with 323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86 not found: ID does not exist" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.457729 4752 scope.go:117] "RemoveContainer" containerID="9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.457969 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808"} err="failed to get container status \"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808\": rpc error: code = NotFound desc = could not find container \"9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808\": container with ID starting with 9208a08450d86b0afe8f817f99a6c0a20c47346a5b142780bc45656d96f82808 not found: ID does not exist" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.458019 4752 scope.go:117] "RemoveContainer" containerID="323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.458268 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86"} err="failed to get container status \"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86\": rpc error: code = NotFound desc = could not find container \"323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86\": container with ID starting with 323341363aa671ef28fc363254f08a868bf5d55eb3157667c89afe7d904eec86 not found: ID does not exist" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.458301 4752 scope.go:117] "RemoveContainer" containerID="bac1392813903ed8becd43cc8d47a7e24faaeb6360ec55c28bab36488f31e191" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.477780 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.485829 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.485885 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.485909 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4qv4\" (UniqueName: \"kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4\") pod \"nova-cell1-conductor-0\" (UID: 
\"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.486626 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.501118 4752 scope.go:117] "RemoveContainer" containerID="c0a1ec892d670ab63c53b4b9aa81154db6743fafe39f3112eea17cfd2ef10528" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.501246 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.502756 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.505064 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.505236 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.517458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587356 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmb6\" (UniqueName: \"kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587408 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587451 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587495 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587524 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587541 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 
11:26:59.587560 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4qv4\" (UniqueName: \"kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.587586 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.591444 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.591806 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.605341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4qv4\" (UniqueName: \"kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4\") pod \"nova-cell1-conductor-0\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.688357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.688456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.688494 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.688593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmb6\" (UniqueName: \"kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.688627 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.690124 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.692359 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.692691 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.692903 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.707402 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmb6\" (UniqueName: \"kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6\") pod \"nova-metadata-0\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " pod="openstack/nova-metadata-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.749112 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 11:26:59 crc kubenswrapper[4752]: I1124 11:26:59.837777 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.232431 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:27:00 crc kubenswrapper[4752]: W1124 11:27:00.238299 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5be4efe6_ed60_4417_a84c_8ff27bf4a685.slice/crio-8836136b2a27c78c3afa53d3e96201339c94b6310c47657ce6f7730172f2a62a WatchSource:0}: Error finding container 8836136b2a27c78c3afa53d3e96201339c94b6310c47657ce6f7730172f2a62a: Status 404 returned error can't find the container with id 8836136b2a27c78c3afa53d3e96201339c94b6310c47657ce6f7730172f2a62a Nov 24 11:27:00 crc kubenswrapper[4752]: W1124 11:27:00.380686 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod158542a8_66b4_40d4_b6a6_12635841260a.slice/crio-ca0c010c770609ecd342df0c5dfd932a5fbee68b9c4410cbc4d32efccb3b69b2 WatchSource:0}: Error finding container ca0c010c770609ecd342df0c5dfd932a5fbee68b9c4410cbc4d32efccb3b69b2: Status 404 returned error can't find the container with id ca0c010c770609ecd342df0c5dfd932a5fbee68b9c4410cbc4d32efccb3b69b2 Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.383473 4752 generic.go:334] "Generic (PLEG): container finished" podID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerID="982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62" exitCode=143 Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.383537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerDied","Data":"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62"} Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.385819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5be4efe6-ed60-4417-a84c-8ff27bf4a685","Type":"ContainerStarted","Data":"8836136b2a27c78c3afa53d3e96201339c94b6310c47657ce6f7730172f2a62a"} Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.391135 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerName="nova-scheduler-scheduler" containerID="cri-o://288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" gracePeriod=30 Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.393955 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.742343 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39fe6b62-7636-4a34-9d7b-965a96563930" path="/var/lib/kubelet/pods/39fe6b62-7636-4a34-9d7b-965a96563930/volumes" Nov 24 11:27:00 crc kubenswrapper[4752]: I1124 11:27:00.743652 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1d72bc-5f15-4bf5-a424-62f34eafd84e" path="/var/lib/kubelet/pods/3e1d72bc-5f15-4bf5-a424-62f34eafd84e/volumes" Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.402590 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerStarted","Data":"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c"} Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.402978 4752 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerStarted","Data":"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4"} Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.402992 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerStarted","Data":"ca0c010c770609ecd342df0c5dfd932a5fbee68b9c4410cbc4d32efccb3b69b2"} Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.404511 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5be4efe6-ed60-4417-a84c-8ff27bf4a685","Type":"ContainerStarted","Data":"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"} Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.404808 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.427876 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.427860197 podStartE2EDuration="2.427860197s" podCreationTimestamp="2025-11-24 11:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:01.420143334 +0000 UTC m=+1227.404963623" watchObservedRunningTime="2025-11-24 11:27:01.427860197 +0000 UTC m=+1227.412680486" Nov 24 11:27:01 crc kubenswrapper[4752]: I1124 11:27:01.448863 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.448844423 podStartE2EDuration="2.448844423s" podCreationTimestamp="2025-11-24 11:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:01.44180258 +0000 UTC m=+1227.426622939" watchObservedRunningTime="2025-11-24 11:27:01.448844423 +0000 UTC m=+1227.433664712" Nov 24 11:27:02 crc kubenswrapper[4752]: E1124 11:27:02.712331 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:27:02 crc kubenswrapper[4752]: E1124 11:27:02.713933 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:27:02 crc kubenswrapper[4752]: E1124 11:27:02.715018 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:27:02 crc kubenswrapper[4752]: E1124 11:27:02.715092 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerName="nova-scheduler-scheduler" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.428054 4752 generic.go:334] "Generic (PLEG): container finished" podID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerID="288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" exitCode=0 Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.428101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d7bc6018-1c58-492d-bc91-5f55c7cf9b18","Type":"ContainerDied","Data":"288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476"} Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.762467 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.868492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data\") pod \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.868714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle\") pod \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.868773 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dbxf\" (UniqueName: \"kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf\") pod \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\" (UID: \"d7bc6018-1c58-492d-bc91-5f55c7cf9b18\") " Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.876955 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf" (OuterVolumeSpecName: "kube-api-access-7dbxf") pod "d7bc6018-1c58-492d-bc91-5f55c7cf9b18" (UID: "d7bc6018-1c58-492d-bc91-5f55c7cf9b18"). InnerVolumeSpecName "kube-api-access-7dbxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.914887 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data" (OuterVolumeSpecName: "config-data") pod "d7bc6018-1c58-492d-bc91-5f55c7cf9b18" (UID: "d7bc6018-1c58-492d-bc91-5f55c7cf9b18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.934876 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7bc6018-1c58-492d-bc91-5f55c7cf9b18" (UID: "d7bc6018-1c58-492d-bc91-5f55c7cf9b18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.971497 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.971529 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dbxf\" (UniqueName: \"kubernetes.io/projected/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-kube-api-access-7dbxf\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:03 crc kubenswrapper[4752]: I1124 11:27:03.971541 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bc6018-1c58-492d-bc91-5f55c7cf9b18-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.108606 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.275917 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data\") pod \"b41fce93-d38d-4688-b35a-17e5075edd6d\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.275967 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn7k9\" (UniqueName: \"kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9\") pod \"b41fce93-d38d-4688-b35a-17e5075edd6d\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.276042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle\") pod \"b41fce93-d38d-4688-b35a-17e5075edd6d\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.276072 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs\") pod \"b41fce93-d38d-4688-b35a-17e5075edd6d\" (UID: \"b41fce93-d38d-4688-b35a-17e5075edd6d\") " Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.276661 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs" (OuterVolumeSpecName: "logs") pod "b41fce93-d38d-4688-b35a-17e5075edd6d" (UID: "b41fce93-d38d-4688-b35a-17e5075edd6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.279206 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9" (OuterVolumeSpecName: "kube-api-access-kn7k9") pod "b41fce93-d38d-4688-b35a-17e5075edd6d" (UID: "b41fce93-d38d-4688-b35a-17e5075edd6d"). InnerVolumeSpecName "kube-api-access-kn7k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.304986 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data" (OuterVolumeSpecName: "config-data") pod "b41fce93-d38d-4688-b35a-17e5075edd6d" (UID: "b41fce93-d38d-4688-b35a-17e5075edd6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.321444 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b41fce93-d38d-4688-b35a-17e5075edd6d" (UID: "b41fce93-d38d-4688-b35a-17e5075edd6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.380872 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.380916 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41fce93-d38d-4688-b35a-17e5075edd6d-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.380928 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41fce93-d38d-4688-b35a-17e5075edd6d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.380947 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn7k9\" (UniqueName: \"kubernetes.io/projected/b41fce93-d38d-4688-b35a-17e5075edd6d-kube-api-access-kn7k9\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.445982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d7bc6018-1c58-492d-bc91-5f55c7cf9b18","Type":"ContainerDied","Data":"0e2d8bfefaa7728d69073898bf3b70c361e281ad5ce5b14fca7a2d27519841c4"} Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.446063 4752 scope.go:117] "RemoveContainer" containerID="288b2a196d84610d0887854db72c66950a41bd302a881fe348a2e3cd499ac476" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.446324 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.449461 4752 generic.go:334] "Generic (PLEG): container finished" podID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerID="487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339" exitCode=0 Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.449496 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerDied","Data":"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339"} Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.449519 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b41fce93-d38d-4688-b35a-17e5075edd6d","Type":"ContainerDied","Data":"54ef48ed3de085be852c7f72040a4b49a11db5c62345f3cd8fc3cef11cc8c51a"} Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.449589 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.537514 4752 scope.go:117] "RemoveContainer" containerID="487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.569634 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.603233 4752 scope.go:117] "RemoveContainer" containerID="982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.629341 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.638913 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.641647 4752 scope.go:117] "RemoveContainer" containerID="487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339" Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.642193 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339\": container with ID starting with 487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339 not found: ID does not exist" containerID="487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.642246 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339"} err="failed to get container status \"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339\": rpc error: code = NotFound desc = could not find container \"487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339\": container with ID starting with 487d1cf1debc3c4e367bce133a79703645d46f0c44d627338dcae62c2393d339 not found: ID does not exist" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.642281 4752 scope.go:117] "RemoveContainer" containerID="982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62" Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.642592 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62\": container with ID starting with 982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62 not found: ID does not exist" containerID="982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.642622 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62"} err="failed to get container status \"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62\": rpc error: code = NotFound desc = could not find container \"982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62\": container with ID starting with 982857c386f1860f089af213f63a4f7919d815dca96aa4c2dd7baf3bb4cc3b62 not found: ID does not exist" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.654283 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666088 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.666518 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-api" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666539 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-api" Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.666557 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-log" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666563 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-log" Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.666574 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerName="nova-scheduler-scheduler" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666581 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerName="nova-scheduler-scheduler" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666732 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" containerName="nova-scheduler-scheduler" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666784 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-log" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.666792 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" containerName="nova-api-api" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.667578 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.672715 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.674168 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.685669 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.687801 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.688452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dwdn\" (UniqueName: \"kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.688522 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.688641 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.690496 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.698612 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:04 crc kubenswrapper[4752]: E1124 11:27:04.733591 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41fce93_d38d_4688_b35a_17e5075edd6d.slice/crio-54ef48ed3de085be852c7f72040a4b49a11db5c62345f3cd8fc3cef11cc8c51a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41fce93_d38d_4688_b35a_17e5075edd6d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bc6018_1c58_492d_bc91_5f55c7cf9b18.slice\": RecentStats: unable to find data in memory cache]" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.740455 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41fce93-d38d-4688-b35a-17e5075edd6d" path="/var/lib/kubelet/pods/b41fce93-d38d-4688-b35a-17e5075edd6d/volumes" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.741642 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bc6018-1c58-492d-bc91-5f55c7cf9b18" path="/var/lib/kubelet/pods/d7bc6018-1c58-492d-bc91-5f55c7cf9b18/volumes" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-krpsk\" (UniqueName: \"kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790087 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dwdn\" (UniqueName: \"kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790112 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790132 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790159 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790273 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.790313 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.794226 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.795087 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.809420 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dwdn\" (UniqueName: \"kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn\") pod \"nova-scheduler-0\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:04 crc kubenswrapper[4752]: 
I1124 11:27:04.843536 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.844884 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.892193 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpsk\" (UniqueName: \"kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.892270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.892303 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.892396 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.893007 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.896807 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.896810 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.907276 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpsk\" (UniqueName: \"kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk\") pod \"nova-api-0\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " pod="openstack/nova-api-0" Nov 24 11:27:04 crc kubenswrapper[4752]: I1124 11:27:04.991562 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:05 crc kubenswrapper[4752]: I1124 11:27:05.009475 4752 util.go:30] "No sandbox for pod can be found. 
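[Annotation] Each recreated pod above mounts a generated kube-api-access-<suffix> projected volume (krpsk, 7dwdn, bbmb6, s4qv4). What is inside is the standard service-account projection. A sketch built with the k8s.io/api/core/v1 types; the 3607-second token expiry and the token/ca.crt/namespace paths are my reading of the upstream defaults, and the suffix is random per pod:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// kubeAPIAccessVolume rebuilds the shape of a generated kube-api-access-*
// volume: a bound service-account token, the cluster CA bundle, and the
// pod's namespace via the downward API. Values are upstream defaults.
func kubeAPIAccessVolume(suffix string) corev1.Volume {
	expiry := int64(3607)
	return corev1.Volume{
		Name: "kube-api-access-" + suffix,
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						ExpirationSeconds: &expiry,
						Path:              "token",
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
}

func main() { fmt.Println(kubeAPIAccessVolume("krpsk").Name) }
```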
Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:05 crc kubenswrapper[4752]: I1124 11:27:05.476861 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:05 crc kubenswrapper[4752]: W1124 11:27:05.478498 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b533d24_9af5_430e_ae85_fefaeea7c4b6.slice/crio-817fd372c8465908993c893569d579693b84a933421a203f61e86c8369a8338d WatchSource:0}: Error finding container 817fd372c8465908993c893569d579693b84a933421a203f61e86c8369a8338d: Status 404 returned error can't find the container with id 817fd372c8465908993c893569d579693b84a933421a203f61e86c8369a8338d Nov 24 11:27:05 crc kubenswrapper[4752]: I1124 11:27:05.552612 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:05 crc kubenswrapper[4752]: W1124 11:27:05.559882 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c9a6ee_a43f_4094_b1dc_a2d493d9efd7.slice/crio-b4ce138556a786dbbc3f7b896af464fbf6ce36bd7134dc351c0809d5423e5035 WatchSource:0}: Error finding container b4ce138556a786dbbc3f7b896af464fbf6ce36bd7134dc351c0809d5423e5035: Status 404 returned error can't find the container with id b4ce138556a786dbbc3f7b896af464fbf6ce36bd7134dc351c0809d5423e5035 Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.468556 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerStarted","Data":"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a"} Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.468614 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerStarted","Data":"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e"} Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.468628 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerStarted","Data":"817fd372c8465908993c893569d579693b84a933421a203f61e86c8369a8338d"} Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.471281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7","Type":"ContainerStarted","Data":"808c2fc55c7ab307f63456bc2ef733ab21e0f0c222fab0682eb7eae08cb59705"} Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.471345 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7","Type":"ContainerStarted","Data":"b4ce138556a786dbbc3f7b896af464fbf6ce36bd7134dc351c0809d5423e5035"} Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.499124 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.501439 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.50142144 podStartE2EDuration="2.50142144s" podCreationTimestamp="2025-11-24 11:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:06.484733227 +0000 UTC 
m=+1232.469553526" watchObservedRunningTime="2025-11-24 11:27:06.50142144 +0000 UTC m=+1232.486241729" Nov 24 11:27:06 crc kubenswrapper[4752]: I1124 11:27:06.507775 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.507753392 podStartE2EDuration="2.507753392s" podCreationTimestamp="2025-11-24 11:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:06.496167978 +0000 UTC m=+1232.480988267" watchObservedRunningTime="2025-11-24 11:27:06.507753392 +0000 UTC m=+1232.492573681" Nov 24 11:27:09 crc kubenswrapper[4752]: I1124 11:27:09.783087 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 24 11:27:09 crc kubenswrapper[4752]: I1124 11:27:09.838860 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 11:27:09 crc kubenswrapper[4752]: I1124 11:27:09.838934 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 11:27:09 crc kubenswrapper[4752]: I1124 11:27:09.992239 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 11:27:10 crc kubenswrapper[4752]: I1124 11:27:10.863102 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:10 crc kubenswrapper[4752]: I1124 11:27:10.863102 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.002944 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.010352 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.010451 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.044951 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.468380 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:27:15 crc kubenswrapper[4752]: I1124 11:27:15.468763 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:27:15 crc 
kubenswrapper[4752]: I1124 11:27:15.591530 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 11:27:16 crc kubenswrapper[4752]: I1124 11:27:16.092074 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:16 crc kubenswrapper[4752]: I1124 11:27:16.092092 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:19 crc kubenswrapper[4752]: I1124 11:27:19.881289 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 11:27:19 crc kubenswrapper[4752]: I1124 11:27:19.882032 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 11:27:19 crc kubenswrapper[4752]: I1124 11:27:19.887310 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 11:27:19 crc kubenswrapper[4752]: I1124 11:27:19.888450 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.633241 4752 generic.go:334] "Generic (PLEG): container finished" podID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" containerID="682dc8577a8c51a3a589c8dcc6befead348a66e7150c6f7f392a3e1290c56b3a" exitCode=137 Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.633348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8fd0f817-bfdb-4147-ac72-b0938d18b0bf","Type":"ContainerDied","Data":"682dc8577a8c51a3a589c8dcc6befead348a66e7150c6f7f392a3e1290c56b3a"} Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.633676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8fd0f817-bfdb-4147-ac72-b0938d18b0bf","Type":"ContainerDied","Data":"83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66"} Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.633697 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a12326562a839150a5b47543feccea41d14745fc0dac4d671cde966ca0dd66" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.679820 4752 util.go:48] "No ready sandbox for pod can be found. 
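[Annotation] The Startup probe failures above are plain HTTP GETs from the kubelet against the pod IP that time out while nova is still initializing; both containers of a pod probe the same endpoint, which is why the lines come in pairs, and the kubelet's HTTPS probes do not verify the serving certificate. A rough Go equivalent; the 1-second timeout is an assumption standing in for the probe's timeoutSeconds:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// httpProbe mirrors an HTTP startup/readiness probe: any status below 400 is
// success, and a timeout surfaces as a failure whose output is the transport
// error, like the "Client.Timeout exceeded" lines above. TLS verification is
// skipped, matching kubelet behavior for HTTPS probe endpoints.
func httpProbe(url string) (ok bool, output string) {
	client := &http.Client{
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return false, fmt.Sprintf("Get %q: %v", url, err)
	}
	defer resp.Body.Close()
	return resp.StatusCode < 400, resp.Status
}

func main() {
	fmt.Println(httpProbe("https://10.217.0.192:8775/"))
}
```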
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.835044 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle\") pod \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.835149 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km8x7\" (UniqueName: \"kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7\") pod \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.835174 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data\") pod \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\" (UID: \"8fd0f817-bfdb-4147-ac72-b0938d18b0bf\") " Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.840405 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7" (OuterVolumeSpecName: "kube-api-access-km8x7") pod "8fd0f817-bfdb-4147-ac72-b0938d18b0bf" (UID: "8fd0f817-bfdb-4147-ac72-b0938d18b0bf"). InnerVolumeSpecName "kube-api-access-km8x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.863175 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data" (OuterVolumeSpecName: "config-data") pod "8fd0f817-bfdb-4147-ac72-b0938d18b0bf" (UID: "8fd0f817-bfdb-4147-ac72-b0938d18b0bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.863618 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fd0f817-bfdb-4147-ac72-b0938d18b0bf" (UID: "8fd0f817-bfdb-4147-ac72-b0938d18b0bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.937987 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.938061 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km8x7\" (UniqueName: \"kubernetes.io/projected/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-kube-api-access-km8x7\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:22 crc kubenswrapper[4752]: I1124 11:27:22.938076 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd0f817-bfdb-4147-ac72-b0938d18b0bf-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.652119 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.705975 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.729041 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.779245 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:27:23 crc kubenswrapper[4752]: E1124 11:27:23.781123 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.781150 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.781989 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.790540 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.795521 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.795868 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.798935 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.813403 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.965864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.965958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.966019 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnns\" (UniqueName: \"kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.966041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:23 crc kubenswrapper[4752]: I1124 11:27:23.966075 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.068195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.068353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.068476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whnns\" (UniqueName: \"kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.068516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.068592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.073153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.074431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.080736 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.081319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.098238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whnns\" (UniqueName: \"kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.115209 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.571075 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:27:24 crc kubenswrapper[4752]: W1124 11:27:24.575173 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0487424d_5178_4843_ae8d_db6c015fe9d4.slice/crio-051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6 WatchSource:0}: Error finding container 051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6: Status 404 returned error can't find the container with id 051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6 Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.666168 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0487424d-5178-4843-ae8d-db6c015fe9d4","Type":"ContainerStarted","Data":"051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6"} Nov 24 11:27:24 crc kubenswrapper[4752]: I1124 11:27:24.747468 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd0f817-bfdb-4147-ac72-b0938d18b0bf" path="/var/lib/kubelet/pods/8fd0f817-bfdb-4147-ac72-b0938d18b0bf/volumes" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.016278 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.016493 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.016635 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.030527 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.680004 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0487424d-5178-4843-ae8d-db6c015fe9d4","Type":"ContainerStarted","Data":"456e104a9500a5d75fcca8f093e7825be12c29b5f33ca5320de653b354d9b91c"} Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.680193 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.683300 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.697664 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.697647607 podStartE2EDuration="2.697647607s" podCreationTimestamp="2025-11-24 11:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:25.696988418 +0000 UTC m=+1251.681808707" watchObservedRunningTime="2025-11-24 11:27:25.697647607 +0000 UTC m=+1251.682467906" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.878759 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.880205 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:25 crc kubenswrapper[4752]: I1124 11:27:25.909892 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002627 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002666 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002710 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnwg5\" (UniqueName: \"kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002819 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.002876 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " 
pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.104850 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnwg5\" (UniqueName: \"kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.104924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.104990 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.105053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.105077 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.105130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.106141 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.106141 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.106153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.106259 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.106366 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.129004 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnwg5\" (UniqueName: \"kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5\") pod \"dnsmasq-dns-cd5cbd7b9-65sfh\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.208573 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:26 crc kubenswrapper[4752]: I1124 11:27:26.694244 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:27:26 crc kubenswrapper[4752]: W1124 11:27:26.708928 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc699191a_ace5_4413_8607_e801f9646d0d.slice/crio-1eabf3bee073b42427c876390f73ac696e84edd5e177af294cff5ca5999ecd4f WatchSource:0}: Error finding container 1eabf3bee073b42427c876390f73ac696e84edd5e177af294cff5ca5999ecd4f: Status 404 returned error can't find the container with id 1eabf3bee073b42427c876390f73ac696e84edd5e177af294cff5ca5999ecd4f Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.699815 4752 generic.go:334] "Generic (PLEG): container finished" podID="c699191a-ace5-4413-8607-e801f9646d0d" containerID="5c192d14aa6a7bd9c02c00cdae6e1a39c7650144567990030adf01797a110db9" exitCode=0 Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.699924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" event={"ID":"c699191a-ace5-4413-8607-e801f9646d0d","Type":"ContainerDied","Data":"5c192d14aa6a7bd9c02c00cdae6e1a39c7650144567990030adf01797a110db9"} Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.700481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" event={"ID":"c699191a-ace5-4413-8607-e801f9646d0d","Type":"ContainerStarted","Data":"1eabf3bee073b42427c876390f73ac696e84edd5e177af294cff5ca5999ecd4f"} Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.840965 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.841312 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-central-agent" containerID="cri-o://27b6d6942722d1db58c66719a8f8491f311077b193e1ef8b8f1f02443418972e" gracePeriod=30 Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.841364 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="sg-core" containerID="cri-o://60be8840e1ee6b3f596a5ce76b04f231bd7b6191f9dc15ca988c87b08abbc4ad" gracePeriod=30 Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.841397 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-notification-agent" containerID="cri-o://606ef6738daabcd00a974684ff257695a7daec78240e6cfd544c3ac99056cf86" gracePeriod=30 Nov 24 11:27:27 crc kubenswrapper[4752]: I1124 11:27:27.842113 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="proxy-httpd" containerID="cri-o://4618b4ee6d11ee424000b275551df92bc5b3fa2d22b3cea12f04bb0291dd372d" gracePeriod=30 Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.029669 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.708680 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" event={"ID":"c699191a-ace5-4413-8607-e801f9646d0d","Type":"ContainerStarted","Data":"69b8d0fb67913018dd4cc726dbf04cca82e339eb6081fe3590586f398c788b8b"} Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.708833 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712019 4752 generic.go:334] "Generic (PLEG): container finished" podID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerID="4618b4ee6d11ee424000b275551df92bc5b3fa2d22b3cea12f04bb0291dd372d" exitCode=0 Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712054 4752 generic.go:334] "Generic (PLEG): container finished" podID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerID="60be8840e1ee6b3f596a5ce76b04f231bd7b6191f9dc15ca988c87b08abbc4ad" exitCode=2 Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712067 4752 generic.go:334] "Generic (PLEG): container finished" podID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerID="27b6d6942722d1db58c66719a8f8491f311077b193e1ef8b8f1f02443418972e" exitCode=0 Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerDied","Data":"4618b4ee6d11ee424000b275551df92bc5b3fa2d22b3cea12f04bb0291dd372d"} Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712142 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerDied","Data":"60be8840e1ee6b3f596a5ce76b04f231bd7b6191f9dc15ca988c87b08abbc4ad"} Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712158 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerDied","Data":"27b6d6942722d1db58c66719a8f8491f311077b193e1ef8b8f1f02443418972e"} Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.712311 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-log" containerID="cri-o://2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e" gracePeriod=30 Nov 24 11:27:28 crc kubenswrapper[4752]: 
I1124 11:27:28.712416 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-api" containerID="cri-o://178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a" gracePeriod=30 Nov 24 11:27:28 crc kubenswrapper[4752]: I1124 11:27:28.727114 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" podStartSLOduration=3.727092594 podStartE2EDuration="3.727092594s" podCreationTimestamp="2025-11-24 11:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:28.7255882 +0000 UTC m=+1254.710408499" watchObservedRunningTime="2025-11-24 11:27:28.727092594 +0000 UTC m=+1254.711912893" Nov 24 11:27:29 crc kubenswrapper[4752]: I1124 11:27:29.115819 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:29 crc kubenswrapper[4752]: I1124 11:27:29.740452 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerID="2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e" exitCode=143 Nov 24 11:27:29 crc kubenswrapper[4752]: I1124 11:27:29.740526 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerDied","Data":"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e"} Nov 24 11:27:30 crc kubenswrapper[4752]: I1124 11:27:30.755429 4752 generic.go:334] "Generic (PLEG): container finished" podID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerID="606ef6738daabcd00a974684ff257695a7daec78240e6cfd544c3ac99056cf86" exitCode=0 Nov 24 11:27:30 crc kubenswrapper[4752]: I1124 11:27:30.755478 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerDied","Data":"606ef6738daabcd00a974684ff257695a7daec78240e6cfd544c3ac99056cf86"} Nov 24 11:27:30 crc kubenswrapper[4752]: I1124 11:27:30.976507 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.104901 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105085 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105186 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9f7l\" (UniqueName: \"kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105334 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105479 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105529 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle\") pod \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\" (UID: \"a901a1c5-91ff-4986-94bf-592ff1c53ec8\") " Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.105568 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.106190 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.106220 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a901a1c5-91ff-4986-94bf-592ff1c53ec8-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.110319 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l" (OuterVolumeSpecName: "kube-api-access-j9f7l") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "kube-api-access-j9f7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.111677 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts" (OuterVolumeSpecName: "scripts") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.135803 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.168307 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.207889 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9f7l\" (UniqueName: \"kubernetes.io/projected/a901a1c5-91ff-4986-94bf-592ff1c53ec8-kube-api-access-j9f7l\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.207918 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.207928 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.207936 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.222462 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.227841 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data" (OuterVolumeSpecName: "config-data") pod "a901a1c5-91ff-4986-94bf-592ff1c53ec8" (UID: "a901a1c5-91ff-4986-94bf-592ff1c53ec8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.309925 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.309955 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a901a1c5-91ff-4986-94bf-592ff1c53ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.767432 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a901a1c5-91ff-4986-94bf-592ff1c53ec8","Type":"ContainerDied","Data":"56f2528d0827a7ddfeb81df3732f32e7a0a596da619da02f5709ae432bbcf951"} Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.767776 4752 scope.go:117] "RemoveContainer" containerID="4618b4ee6d11ee424000b275551df92bc5b3fa2d22b3cea12f04bb0291dd372d" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.767491 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.799972 4752 scope.go:117] "RemoveContainer" containerID="60be8840e1ee6b3f596a5ce76b04f231bd7b6191f9dc15ca988c87b08abbc4ad" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.810431 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.816629 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.823826 4752 scope.go:117] "RemoveContainer" containerID="606ef6738daabcd00a974684ff257695a7daec78240e6cfd544c3ac99056cf86" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.833236 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:31 crc kubenswrapper[4752]: E1124 11:27:31.833710 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="proxy-httpd" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.833732 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="proxy-httpd" Nov 24 11:27:31 crc kubenswrapper[4752]: E1124 11:27:31.833773 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="sg-core" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.833780 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="sg-core" Nov 24 11:27:31 crc kubenswrapper[4752]: E1124 11:27:31.833792 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-central-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.833798 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-central-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: E1124 11:27:31.833814 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-notification-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.833820 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-notification-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.834029 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="proxy-httpd" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.834042 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-notification-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.834055 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="ceilometer-central-agent" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.834068 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" containerName="sg-core" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.836636 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.840645 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.840701 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.840805 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.844567 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.864839 4752 scope.go:117] "RemoveContainer" containerID="27b6d6942722d1db58c66719a8f8491f311077b193e1ef8b8f1f02443418972e" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921141 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921220 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921379 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921490 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921605 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921695 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4wc\" (UniqueName: \"kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: I1124 11:27:31.921805 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:31 crc kubenswrapper[4752]: 
I1124 11:27:31.921841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.023878 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.023954 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024025 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4wc\" (UniqueName: \"kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024126 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024152 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024190 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.024238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.026938 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 
11:27:32.027050 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.030633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.030719 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.030773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.031427 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.033079 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.046132 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4wc\" (UniqueName: \"kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc\") pod \"ceilometer-0\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.258044 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.275063 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.435378 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data\") pod \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.435480 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krpsk\" (UniqueName: \"kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk\") pod \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.435547 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle\") pod \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.435656 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs\") pod \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\" (UID: \"1b533d24-9af5-430e-ae85-fefaeea7c4b6\") " Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.436383 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs" (OuterVolumeSpecName: "logs") pod "1b533d24-9af5-430e-ae85-fefaeea7c4b6" (UID: "1b533d24-9af5-430e-ae85-fefaeea7c4b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.444042 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk" (OuterVolumeSpecName: "kube-api-access-krpsk") pod "1b533d24-9af5-430e-ae85-fefaeea7c4b6" (UID: "1b533d24-9af5-430e-ae85-fefaeea7c4b6"). InnerVolumeSpecName "kube-api-access-krpsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.476197 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data" (OuterVolumeSpecName: "config-data") pod "1b533d24-9af5-430e-ae85-fefaeea7c4b6" (UID: "1b533d24-9af5-430e-ae85-fefaeea7c4b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.477678 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b533d24-9af5-430e-ae85-fefaeea7c4b6" (UID: "1b533d24-9af5-430e-ae85-fefaeea7c4b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.538481 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b533d24-9af5-430e-ae85-fefaeea7c4b6-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.538517 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.538825 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krpsk\" (UniqueName: \"kubernetes.io/projected/1b533d24-9af5-430e-ae85-fefaeea7c4b6-kube-api-access-krpsk\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.538841 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b533d24-9af5-430e-ae85-fefaeea7c4b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.740684 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a901a1c5-91ff-4986-94bf-592ff1c53ec8" path="/var/lib/kubelet/pods/a901a1c5-91ff-4986-94bf-592ff1c53ec8/volumes" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.783169 4752 generic.go:334] "Generic (PLEG): container finished" podID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerID="178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a" exitCode=0 Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.783210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerDied","Data":"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a"} Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.783531 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b533d24-9af5-430e-ae85-fefaeea7c4b6","Type":"ContainerDied","Data":"817fd372c8465908993c893569d579693b84a933421a203f61e86c8369a8338d"} Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.783254 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.783553 4752 scope.go:117] "RemoveContainer" containerID="178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.814965 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.826278 4752 scope.go:117] "RemoveContainer" containerID="2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.852700 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.869589 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.879848 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:32 crc kubenswrapper[4752]: E1124 11:27:32.880542 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-api" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.880564 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-api" Nov 24 11:27:32 crc kubenswrapper[4752]: E1124 11:27:32.880579 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-log" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.880586 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-log" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.888293 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-api" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.888414 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" containerName="nova-api-log" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.891787 4752 scope.go:117] "RemoveContainer" containerID="178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a" Nov 24 11:27:32 crc kubenswrapper[4752]: E1124 11:27:32.892259 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a\": container with ID starting with 178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a not found: ID does not exist" containerID="178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.892289 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a"} err="failed to get container status \"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a\": rpc error: code = NotFound desc = could not find container \"178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a\": container with ID starting with 178d0813a181d031999d15fbc9f5beb37bcb0382ccd95515501ba5834ab3ae0a not found: ID does not exist" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.892313 4752 scope.go:117] "RemoveContainer" 
containerID="2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e" Nov 24 11:27:32 crc kubenswrapper[4752]: E1124 11:27:32.892550 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e\": container with ID starting with 2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e not found: ID does not exist" containerID="2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.892587 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e"} err="failed to get container status \"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e\": rpc error: code = NotFound desc = could not find container \"2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e\": container with ID starting with 2324ffde568e947f0a30ea2e426277d57f4bd20325222b7f8d7152d53a90cb0e not found: ID does not exist" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.893673 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.896168 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.896456 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.896600 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 24 11:27:32 crc kubenswrapper[4752]: I1124 11:27:32.897503 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.060961 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.061063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.061113 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.061355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5zm\" (UniqueName: \"kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.061494 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.061554 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163547 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163728 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163813 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.163859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5zm\" (UniqueName: \"kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.165557 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.171256 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.180425 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.180645 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.181368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.188328 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5zm\" (UniqueName: \"kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm\") pod \"nova-api-0\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.219218 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.709359 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:33 crc kubenswrapper[4752]: W1124 11:27:33.715833 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c662bc9_4d2f_4dfb_83d3_b0458e8e90b3.slice/crio-85b9e522609f314c53f6691d58d0b017b257aaf953a146d609856d7c95531cfc WatchSource:0}: Error finding container 85b9e522609f314c53f6691d58d0b017b257aaf953a146d609856d7c95531cfc: Status 404 returned error can't find the container with id 85b9e522609f314c53f6691d58d0b017b257aaf953a146d609856d7c95531cfc Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.795525 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerStarted","Data":"85b9e522609f314c53f6691d58d0b017b257aaf953a146d609856d7c95531cfc"} Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.798613 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerStarted","Data":"42af4a9e7e4f79d0e530981eec488f3193a8e0ab8487ba8583a6beacc33cd3f9"} Nov 24 11:27:33 crc kubenswrapper[4752]: I1124 11:27:33.798677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerStarted","Data":"13c31024e25cdc8a474efaac8c249f5b73c4d7487d6ccdc27e5f7d73d85d329a"} Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.115715 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.150398 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.747424 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1b533d24-9af5-430e-ae85-fefaeea7c4b6" path="/var/lib/kubelet/pods/1b533d24-9af5-430e-ae85-fefaeea7c4b6/volumes" Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.811766 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerStarted","Data":"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae"} Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.812124 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerStarted","Data":"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c"} Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.814378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerStarted","Data":"3e3197854a26bf19017707f823b368d90d33292e1a5a4ca1ccb629c45420d998"} Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.838428 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:27:34 crc kubenswrapper[4752]: I1124 11:27:34.859471 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.859452272 podStartE2EDuration="2.859452272s" podCreationTimestamp="2025-11-24 11:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:34.83896369 +0000 UTC m=+1260.823783979" watchObservedRunningTime="2025-11-24 11:27:34.859452272 +0000 UTC m=+1260.844272561" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.012234 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2shz6"] Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.013480 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.015788 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.017322 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.026189 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2shz6"] Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.098408 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.098486 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.098727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.098861 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8wx8\" (UniqueName: \"kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.200912 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.200975 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.201073 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.201118 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8wx8\" (UniqueName: 
\"kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.205724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.206898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.212365 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.220023 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8wx8\" (UniqueName: \"kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8\") pod \"nova-cell1-cell-mapping-2shz6\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.332525 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.785051 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2shz6"] Nov 24 11:27:35 crc kubenswrapper[4752]: W1124 11:27:35.792456 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b7dddf_2bd1_4632_91ae_01abd3b77e1f.slice/crio-40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311 WatchSource:0}: Error finding container 40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311: Status 404 returned error can't find the container with id 40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311 Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.833714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerStarted","Data":"3511a8fadfd4d408af8dd7cfad50cde220b8abfee4439a6034133cc037a305e8"} Nov 24 11:27:35 crc kubenswrapper[4752]: I1124 11:27:35.836435 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2shz6" event={"ID":"04b7dddf-2bd1-4632-91ae-01abd3b77e1f","Type":"ContainerStarted","Data":"40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311"} Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.210518 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.306415 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.306899 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-2258v" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="dnsmasq-dns" containerID="cri-o://3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99" gracePeriod=10 Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.759713 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.834767 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.835043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.835078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.835113 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68trx\" (UniqueName: \"kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.835171 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.835226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc\") pod \"917b2118-45fd-437a-8f50-7b5f30336603\" (UID: \"917b2118-45fd-437a-8f50-7b5f30336603\") " Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.857875 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx" (OuterVolumeSpecName: "kube-api-access-68trx") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "kube-api-access-68trx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.872526 4752 generic.go:334] "Generic (PLEG): container finished" podID="917b2118-45fd-437a-8f50-7b5f30336603" containerID="3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99" exitCode=0 Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.872601 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-2258v" event={"ID":"917b2118-45fd-437a-8f50-7b5f30336603","Type":"ContainerDied","Data":"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99"} Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.872605 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-2258v" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.872629 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-2258v" event={"ID":"917b2118-45fd-437a-8f50-7b5f30336603","Type":"ContainerDied","Data":"a326f075bbba45680d191aba27ffcef9a5204e9927aa6957ad0427f124e2b53d"} Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.872647 4752 scope.go:117] "RemoveContainer" containerID="3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.877999 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerStarted","Data":"b64a3d9cdc49b880ccf1a894ebc75a567c3fbafc89ff71380851b7711949247f"} Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.879447 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.885967 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2shz6" event={"ID":"04b7dddf-2bd1-4632-91ae-01abd3b77e1f","Type":"ContainerStarted","Data":"b6a6637e2c4a956270f0ef941c35eb469cca815e47679929b858c1f2f545b628"} Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.910823 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config" (OuterVolumeSpecName: "config") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.916590 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.917824 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.918633 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.928899 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.718345149 podStartE2EDuration="5.928875799s" podCreationTimestamp="2025-11-24 11:27:31 +0000 UTC" firstStartedPulling="2025-11-24 11:27:32.826242451 +0000 UTC m=+1258.811062740" lastFinishedPulling="2025-11-24 11:27:36.036773081 +0000 UTC m=+1262.021593390" observedRunningTime="2025-11-24 11:27:36.911050514 +0000 UTC m=+1262.895870803" watchObservedRunningTime="2025-11-24 11:27:36.928875799 +0000 UTC m=+1262.913696108" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.931811 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2shz6" podStartSLOduration=2.931801084 podStartE2EDuration="2.931801084s" podCreationTimestamp="2025-11-24 11:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:36.930926789 +0000 UTC m=+1262.915747078" watchObservedRunningTime="2025-11-24 11:27:36.931801084 +0000 UTC m=+1262.916621373" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.937558 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.937594 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.937607 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68trx\" (UniqueName: \"kubernetes.io/projected/917b2118-45fd-437a-8f50-7b5f30336603-kube-api-access-68trx\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.937620 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.937628 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:36 crc kubenswrapper[4752]: I1124 11:27:36.938346 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "917b2118-45fd-437a-8f50-7b5f30336603" (UID: "917b2118-45fd-437a-8f50-7b5f30336603"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.034220 4752 scope.go:117] "RemoveContainer" containerID="8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.039725 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/917b2118-45fd-437a-8f50-7b5f30336603-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.062388 4752 scope.go:117] "RemoveContainer" containerID="3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99" Nov 24 11:27:37 crc kubenswrapper[4752]: E1124 11:27:37.063525 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99\": container with ID starting with 3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99 not found: ID does not exist" containerID="3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.063708 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99"} err="failed to get container status \"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99\": rpc error: code = NotFound desc = could not find container \"3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99\": container with ID starting with 3a73241e3ce20eb094929c8502acd81796faf1de15739a96eceae7def7765f99 not found: ID does not exist" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.063855 4752 scope.go:117] "RemoveContainer" containerID="8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9" Nov 24 11:27:37 crc kubenswrapper[4752]: E1124 11:27:37.066035 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9\": container with ID starting with 8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9 not found: ID does not exist" containerID="8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.066097 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9"} err="failed to get container status \"8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9\": rpc error: code = NotFound desc = could not find container \"8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9\": container with ID starting with 8314c680dec09416c0b7a5d467dd23d0b83e976435017c1eb80827410969b0a9 not found: ID does not exist" Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.223963 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:27:37 crc kubenswrapper[4752]: I1124 11:27:37.233847 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-2258v"] Nov 24 11:27:38 crc kubenswrapper[4752]: I1124 11:27:38.749115 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="917b2118-45fd-437a-8f50-7b5f30336603" path="/var/lib/kubelet/pods/917b2118-45fd-437a-8f50-7b5f30336603/volumes" 
Nov 24 11:27:40 crc kubenswrapper[4752]: I1124 11:27:40.924424 4752 generic.go:334] "Generic (PLEG): container finished" podID="04b7dddf-2bd1-4632-91ae-01abd3b77e1f" containerID="b6a6637e2c4a956270f0ef941c35eb469cca815e47679929b858c1f2f545b628" exitCode=0 Nov 24 11:27:40 crc kubenswrapper[4752]: I1124 11:27:40.924711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2shz6" event={"ID":"04b7dddf-2bd1-4632-91ae-01abd3b77e1f","Type":"ContainerDied","Data":"b6a6637e2c4a956270f0ef941c35eb469cca815e47679929b858c1f2f545b628"} Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.377068 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.446531 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle\") pod \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.446631 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts\") pod \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.446657 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data\") pod \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.446890 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8wx8\" (UniqueName: \"kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8\") pod \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\" (UID: \"04b7dddf-2bd1-4632-91ae-01abd3b77e1f\") " Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.452655 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8" (OuterVolumeSpecName: "kube-api-access-n8wx8") pod "04b7dddf-2bd1-4632-91ae-01abd3b77e1f" (UID: "04b7dddf-2bd1-4632-91ae-01abd3b77e1f"). InnerVolumeSpecName "kube-api-access-n8wx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.452972 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts" (OuterVolumeSpecName: "scripts") pod "04b7dddf-2bd1-4632-91ae-01abd3b77e1f" (UID: "04b7dddf-2bd1-4632-91ae-01abd3b77e1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.476633 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data" (OuterVolumeSpecName: "config-data") pod "04b7dddf-2bd1-4632-91ae-01abd3b77e1f" (UID: "04b7dddf-2bd1-4632-91ae-01abd3b77e1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.481015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b7dddf-2bd1-4632-91ae-01abd3b77e1f" (UID: "04b7dddf-2bd1-4632-91ae-01abd3b77e1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.548535 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8wx8\" (UniqueName: \"kubernetes.io/projected/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-kube-api-access-n8wx8\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.548569 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.548579 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.548588 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b7dddf-2bd1-4632-91ae-01abd3b77e1f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.951440 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2shz6" event={"ID":"04b7dddf-2bd1-4632-91ae-01abd3b77e1f","Type":"ContainerDied","Data":"40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311"} Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.951763 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40baf1e421319eb598fe907be353b0a86a71bc763cbb17f53ec575dbd0494311" Nov 24 11:27:42 crc kubenswrapper[4752]: I1124 11:27:42.951598 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2shz6" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.147280 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.147574 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-log" containerID="cri-o://5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" gracePeriod=30 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.147709 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-api" containerID="cri-o://becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" gracePeriod=30 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.181830 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.182060 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" containerName="nova-scheduler-scheduler" containerID="cri-o://808c2fc55c7ab307f63456bc2ef733ab21e0f0c222fab0682eb7eae08cb59705" gracePeriod=30 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.196657 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.197238 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" containerID="cri-o://ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c" gracePeriod=30 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.197192 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" containerID="cri-o://58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4" gracePeriod=30 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.846439 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874101 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874362 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874507 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874660 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874701 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq5zm\" (UniqueName: \"kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.874800 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data\") pod \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\" (UID: \"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3\") " Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.875040 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs" (OuterVolumeSpecName: "logs") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.876036 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.891223 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm" (OuterVolumeSpecName: "kube-api-access-zq5zm") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "kube-api-access-zq5zm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.938128 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.946693 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data" (OuterVolumeSpecName: "config-data") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.964859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.975971 4752 generic.go:334] "Generic (PLEG): container finished" podID="158542a8-66b4-40d4-b6a6-12635841260a" containerID="58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4" exitCode=143 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.976074 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerDied","Data":"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4"} Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.977421 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.977451 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.977464 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq5zm\" (UniqueName: \"kubernetes.io/projected/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-kube-api-access-zq5zm\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.977478 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.978826 4752 generic.go:334] "Generic (PLEG): container finished" podID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerID="becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" exitCode=0 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.978856 4752 generic.go:334] "Generic (PLEG): container finished" podID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerID="5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" exitCode=143 Nov 24 11:27:43 
crc kubenswrapper[4752]: I1124 11:27:43.978917 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerDied","Data":"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae"} Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.978943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerDied","Data":"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c"} Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.978953 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3","Type":"ContainerDied","Data":"85b9e522609f314c53f6691d58d0b017b257aaf953a146d609856d7c95531cfc"} Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.978967 4752 scope.go:117] "RemoveContainer" containerID="becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.979085 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.981484 4752 generic.go:334] "Generic (PLEG): container finished" podID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" containerID="808c2fc55c7ab307f63456bc2ef733ab21e0f0c222fab0682eb7eae08cb59705" exitCode=0 Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.981505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7","Type":"ContainerDied","Data":"808c2fc55c7ab307f63456bc2ef733ab21e0f0c222fab0682eb7eae08cb59705"} Nov 24 11:27:43 crc kubenswrapper[4752]: I1124 11:27:43.985921 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" (UID: "8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.004381 4752 scope.go:117] "RemoveContainer" containerID="5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.021387 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.030804 4752 scope.go:117] "RemoveContainer" containerID="becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.031465 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae\": container with ID starting with becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae not found: ID does not exist" containerID="becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.031519 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae"} err="failed to get container status \"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae\": rpc error: code = NotFound desc = could not find container \"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae\": container with ID starting with becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae not found: ID does not exist" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.031551 4752 scope.go:117] "RemoveContainer" containerID="5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.031906 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c\": container with ID starting with 5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c not found: ID does not exist" containerID="5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.031931 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c"} err="failed to get container status \"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c\": rpc error: code = NotFound desc = could not find container \"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c\": container with ID starting with 5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c not found: ID does not exist" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.031946 4752 scope.go:117] "RemoveContainer" containerID="becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.032136 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae"} err="failed to get container status \"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae\": rpc error: code = NotFound desc = could not find container \"becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae\": container with ID starting with becd5b7341431fd379e4b339476a5becfa27ad38fc8cbc796194303467e448ae not found: ID does not exist" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.032153 4752 scope.go:117] "RemoveContainer" containerID="5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.032302 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c"} err="failed to get container status \"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c\": rpc error: code = NotFound desc = could not find container \"5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c\": container with ID starting with 5ee796ae4d55e1d4b3a1e67f40a0194e7d8584bd99c3c44b43f31a6ed0846d9c not found: ID does not exist" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.078513 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle\") pod \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.078567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dwdn\" (UniqueName: \"kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn\") pod \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.078623 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data\") pod \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\" (UID: \"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7\") " Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.079110 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.083143 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn" (OuterVolumeSpecName: "kube-api-access-7dwdn") pod "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" (UID: "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7"). InnerVolumeSpecName "kube-api-access-7dwdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.106134 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data" (OuterVolumeSpecName: "config-data") pod "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" (UID: "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.112439 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" (UID: "d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.180897 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.180934 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.180949 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dwdn\" (UniqueName: \"kubernetes.io/projected/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7-kube-api-access-7dwdn\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.313347 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.321688 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334009 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334353 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" containerName="nova-scheduler-scheduler" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334367 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" containerName="nova-scheduler-scheduler" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334382 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="init" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334388 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="init" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334400 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-api" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334407 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-api" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334419 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-log" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334424 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-log" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334451 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b7dddf-2bd1-4632-91ae-01abd3b77e1f" containerName="nova-manage" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334457 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b7dddf-2bd1-4632-91ae-01abd3b77e1f" containerName="nova-manage" Nov 24 11:27:44 crc kubenswrapper[4752]: E1124 11:27:44.334472 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="dnsmasq-dns" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334477 4752 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="dnsmasq-dns" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334693 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" containerName="nova-scheduler-scheduler" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334708 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-api" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334719 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" containerName="nova-api-log" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334731 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="917b2118-45fd-437a-8f50-7b5f30336603" containerName="dnsmasq-dns" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.334764 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b7dddf-2bd1-4632-91ae-01abd3b77e1f" containerName="nova-manage" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.335701 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.338937 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.338981 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.339201 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.352652 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385080 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385189 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5rw\" (UniqueName: \"kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385338 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385387 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.385429 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487158 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487265 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487294 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5rw\" (UniqueName: \"kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487379 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487428 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.487459 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.488146 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.491882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.491965 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.496317 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.497013 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.505189 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5rw\" (UniqueName: \"kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw\") pod \"nova-api-0\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.707292 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.738429 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3" path="/var/lib/kubelet/pods/8c662bc9-4d2f-4dfb-83d3-b0458e8e90b3/volumes" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.993599 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7","Type":"ContainerDied","Data":"b4ce138556a786dbbc3f7b896af464fbf6ce36bd7134dc351c0809d5423e5035"} Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.993650 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:44 crc kubenswrapper[4752]: I1124 11:27:44.994017 4752 scope.go:117] "RemoveContainer" containerID="808c2fc55c7ab307f63456bc2ef733ab21e0f0c222fab0682eb7eae08cb59705" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.016118 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.035166 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.045391 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.046705 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.048338 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.052765 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.098368 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.098509 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxms\" (UniqueName: \"kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.098558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.200089 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.200357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxms\" (UniqueName: \"kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.200457 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.200563 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.206321 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.206427 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 
11:27:45.218382 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxms\" (UniqueName: \"kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms\") pod \"nova-scheduler-0\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.366900 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.468334 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.468381 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:27:45 crc kubenswrapper[4752]: I1124 11:27:45.839820 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:27:45 crc kubenswrapper[4752]: W1124 11:27:45.844029 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd071fb7_f9a2_4f4a_aad6_90c340f0d009.slice/crio-4d6071de5a0edc39a54a13c20117ae1d903587e9e41e885ca753b6c25d7d3f06 WatchSource:0}: Error finding container 4d6071de5a0edc39a54a13c20117ae1d903587e9e41e885ca753b6c25d7d3f06: Status 404 returned error can't find the container with id 4d6071de5a0edc39a54a13c20117ae1d903587e9e41e885ca753b6c25d7d3f06 Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.006823 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fd071fb7-f9a2-4f4a-aad6-90c340f0d009","Type":"ContainerStarted","Data":"4d6071de5a0edc39a54a13c20117ae1d903587e9e41e885ca753b6c25d7d3f06"} Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.009190 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerStarted","Data":"858e76b45449fa0fcdbccdf4d4f6325a65859816c0ee8d347ba69396fbd78a83"} Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.009224 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerStarted","Data":"9568b7bef3cbeeeda117b907e1246694e8acf31ceba1e01df07fa02cb3727f40"} Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.009238 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerStarted","Data":"710ca0b3182fe50b2b01b8c727287d36e5ef90de149923679f0ad39a88ac05a6"} Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.035958 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.035939903 podStartE2EDuration="2.035939903s" podCreationTimestamp="2025-11-24 11:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 
11:27:46.030890757 +0000 UTC m=+1272.015711046" watchObservedRunningTime="2025-11-24 11:27:46.035939903 +0000 UTC m=+1272.020760192" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.338983 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:48166->10.217.0.192:8775: read: connection reset by peer" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.339050 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:48150->10.217.0.192:8775: read: connection reset by peer" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.741446 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7" path="/var/lib/kubelet/pods/d2c9a6ee-a43f-4094-b1dc-a2d493d9efd7/volumes" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.808506 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.837407 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbmb6\" (UniqueName: \"kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6\") pod \"158542a8-66b4-40d4-b6a6-12635841260a\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.837550 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle\") pod \"158542a8-66b4-40d4-b6a6-12635841260a\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.837601 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data\") pod \"158542a8-66b4-40d4-b6a6-12635841260a\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.837625 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs\") pod \"158542a8-66b4-40d4-b6a6-12635841260a\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.837796 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs\") pod \"158542a8-66b4-40d4-b6a6-12635841260a\" (UID: \"158542a8-66b4-40d4-b6a6-12635841260a\") " Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.838419 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs" (OuterVolumeSpecName: "logs") pod "158542a8-66b4-40d4-b6a6-12635841260a" (UID: "158542a8-66b4-40d4-b6a6-12635841260a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.846623 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6" (OuterVolumeSpecName: "kube-api-access-bbmb6") pod "158542a8-66b4-40d4-b6a6-12635841260a" (UID: "158542a8-66b4-40d4-b6a6-12635841260a"). InnerVolumeSpecName "kube-api-access-bbmb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.875651 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "158542a8-66b4-40d4-b6a6-12635841260a" (UID: "158542a8-66b4-40d4-b6a6-12635841260a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.901947 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data" (OuterVolumeSpecName: "config-data") pod "158542a8-66b4-40d4-b6a6-12635841260a" (UID: "158542a8-66b4-40d4-b6a6-12635841260a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.916455 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "158542a8-66b4-40d4-b6a6-12635841260a" (UID: "158542a8-66b4-40d4-b6a6-12635841260a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.942697 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbmb6\" (UniqueName: \"kubernetes.io/projected/158542a8-66b4-40d4-b6a6-12635841260a-kube-api-access-bbmb6\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.942763 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.942776 4752 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.942789 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158542a8-66b4-40d4-b6a6-12635841260a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:46 crc kubenswrapper[4752]: I1124 11:27:46.942824 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158542a8-66b4-40d4-b6a6-12635841260a-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.031579 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fd071fb7-f9a2-4f4a-aad6-90c340f0d009","Type":"ContainerStarted","Data":"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4"} Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.035495 4752 generic.go:334] "Generic (PLEG): container finished" podID="158542a8-66b4-40d4-b6a6-12635841260a" containerID="ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c" exitCode=0 Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.036071 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.036105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerDied","Data":"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c"} Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.036152 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"158542a8-66b4-40d4-b6a6-12635841260a","Type":"ContainerDied","Data":"ca0c010c770609ecd342df0c5dfd932a5fbee68b9c4410cbc4d32efccb3b69b2"} Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.036177 4752 scope.go:117] "RemoveContainer" containerID="ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.051932 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.05191357 podStartE2EDuration="2.05191357s" podCreationTimestamp="2025-11-24 11:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:47.04707718 +0000 UTC m=+1273.031897469" watchObservedRunningTime="2025-11-24 11:27:47.05191357 +0000 UTC m=+1273.036733859" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.083314 4752 scope.go:117] "RemoveContainer" containerID="58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.087530 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.142986 4752 scope.go:117] "RemoveContainer" containerID="ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.146931 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:47 crc kubenswrapper[4752]: E1124 11:27:47.147296 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c\": container with ID starting with ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c not found: ID does not exist" containerID="ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.147335 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c"} err="failed to get container status \"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c\": rpc error: code = NotFound desc = could not find container \"ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c\": container with ID starting with ebca040b3207b72cdc197f7d58c04e61d32418577e696984d81a85aa96df237c not found: ID does not exist" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.147361 4752 scope.go:117] "RemoveContainer" containerID="58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4" Nov 24 11:27:47 crc kubenswrapper[4752]: E1124 11:27:47.154374 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4\": container 
with ID starting with 58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4 not found: ID does not exist" containerID="58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.154413 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4"} err="failed to get container status \"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4\": rpc error: code = NotFound desc = could not find container \"58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4\": container with ID starting with 58348713d78cef3ec61b99e837a147bdbf417d4939f96f8efbfc5ceecb421aa4 not found: ID does not exist" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.161209 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:47 crc kubenswrapper[4752]: E1124 11:27:47.161718 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.161736 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" Nov 24 11:27:47 crc kubenswrapper[4752]: E1124 11:27:47.161773 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.161782 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.162013 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-metadata" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.162041 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="158542a8-66b4-40d4-b6a6-12635841260a" containerName="nova-metadata-log" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.163237 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.165430 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.167514 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.167806 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.246939 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.247015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.247041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dklh\" (UniqueName: \"kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.247106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.247131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.349076 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.349122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.349235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc 
kubenswrapper[4752]: I1124 11:27:47.349277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.349292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dklh\" (UniqueName: \"kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.349864 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.354096 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.354379 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.355369 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.365517 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dklh\" (UniqueName: \"kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh\") pod \"nova-metadata-0\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.485709 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:27:47 crc kubenswrapper[4752]: I1124 11:27:47.923847 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:27:48 crc kubenswrapper[4752]: I1124 11:27:48.051472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerStarted","Data":"27624a3ee861162764efc4d71981b850fb235ccd4efc483dcb937fd0d6cf78fb"} Nov 24 11:27:48 crc kubenswrapper[4752]: I1124 11:27:48.741732 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158542a8-66b4-40d4-b6a6-12635841260a" path="/var/lib/kubelet/pods/158542a8-66b4-40d4-b6a6-12635841260a/volumes" Nov 24 11:27:49 crc kubenswrapper[4752]: I1124 11:27:49.063423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerStarted","Data":"820c72aca67b79ea64a516208b0eaa7594ab5684ba110221a1b876bae259b74a"} Nov 24 11:27:49 crc kubenswrapper[4752]: I1124 11:27:49.063762 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerStarted","Data":"6d5f6c189c0f341ca1cfaf61e6a7acbcdd7eb6be88bcb7818cbe04ea74b7b501"} Nov 24 11:27:49 crc kubenswrapper[4752]: I1124 11:27:49.084298 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.084281436 podStartE2EDuration="2.084281436s" podCreationTimestamp="2025-11-24 11:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:27:49.083076811 +0000 UTC m=+1275.067897110" watchObservedRunningTime="2025-11-24 11:27:49.084281436 +0000 UTC m=+1275.069101725" Nov 24 11:27:50 crc kubenswrapper[4752]: I1124 11:27:50.368830 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 11:27:52 crc kubenswrapper[4752]: I1124 11:27:52.486563 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 11:27:52 crc kubenswrapper[4752]: I1124 11:27:52.486989 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 11:27:54 crc kubenswrapper[4752]: I1124 11:27:54.708530 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:27:54 crc kubenswrapper[4752]: I1124 11:27:54.708649 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 11:27:55 crc kubenswrapper[4752]: I1124 11:27:55.368652 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 11:27:55 crc kubenswrapper[4752]: I1124 11:27:55.394949 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 11:27:55 crc kubenswrapper[4752]: I1124 11:27:55.722983 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:55 crc kubenswrapper[4752]: I1124 11:27:55.723010 4752 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:56 crc kubenswrapper[4752]: I1124 11:27:56.153068 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 11:27:57 crc kubenswrapper[4752]: I1124 11:27:57.486181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 11:27:57 crc kubenswrapper[4752]: I1124 11:27:57.486509 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 11:27:58 crc kubenswrapper[4752]: I1124 11:27:58.500938 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 11:27:58 crc kubenswrapper[4752]: I1124 11:27:58.500946 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 11:28:02 crc kubenswrapper[4752]: I1124 11:28:02.283357 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.717012 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.717589 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.717960 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.718016 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.725609 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 11:28:04 crc kubenswrapper[4752]: I1124 11:28:04.726320 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 11:28:07 crc kubenswrapper[4752]: I1124 11:28:07.493461 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 11:28:07 crc kubenswrapper[4752]: I1124 11:28:07.496026 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 11:28:07 crc kubenswrapper[4752]: I1124 11:28:07.500506 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 11:28:08 crc kubenswrapper[4752]: I1124 11:28:08.266537 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 11:28:15 crc kubenswrapper[4752]: I1124 11:28:15.468466 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:28:15 crc kubenswrapper[4752]: I1124 11:28:15.469056 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:28:15 crc kubenswrapper[4752]: I1124 11:28:15.469100 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:28:15 crc kubenswrapper[4752]: I1124 11:28:15.469921 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:28:15 crc kubenswrapper[4752]: I1124 11:28:15.469979 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39" gracePeriod=600 Nov 24 11:28:16 crc kubenswrapper[4752]: I1124 11:28:16.342185 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39" exitCode=0 Nov 24 11:28:16 crc kubenswrapper[4752]: I1124 11:28:16.342266 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39"} Nov 24 11:28:16 crc kubenswrapper[4752]: I1124 11:28:16.342487 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03"} Nov 24 11:28:16 crc kubenswrapper[4752]: I1124 11:28:16.342510 4752 scope.go:117] "RemoveContainer" containerID="47e4c2373d3287581d775010f911ee2e00a31c4abfe85132a68240b84e00095c" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.484937 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.486473 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" containerName="openstackclient" containerID="cri-o://cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b" gracePeriod=2 Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.492496 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.669507 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.691627 4752 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder3b7b-account-delete-rj27n"] Nov 24 11:28:26 crc kubenswrapper[4752]: E1124 11:28:26.692372 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" containerName="openstackclient" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.692446 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" containerName="openstackclient" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.692708 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" containerName="openstackclient" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.693948 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.718468 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder3b7b-account-delete-rj27n"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.813615 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.814151 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement83dd-account-delete-87gjb"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.815643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.815727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hrns\" (UniqueName: \"kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.817530 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:26 crc kubenswrapper[4752]: E1124 11:28:26.819320 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 24 11:28:26 crc kubenswrapper[4752]: E1124 11:28:26.819402 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data podName:a1f1e943-afb7-40c3-9ff2-56791f4e0ad5 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:27.319375678 +0000 UTC m=+1313.304196057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data") pod "rabbitmq-server-0" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5") : configmap "rabbitmq-config-data" not found Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.819385 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="openstack-network-exporter" containerID="cri-o://e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127" gracePeriod=300 Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.838360 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement83dd-account-delete-87gjb"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.862863 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.865904 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.895442 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"] Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.921957 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.922009 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.922041 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hrns\" (UniqueName: \"kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.922101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svb6w\" (UniqueName: \"kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.922834 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.945247 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:28:26 crc 
kubenswrapper[4752]: I1124 11:28:26.968641 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hrns\" (UniqueName: \"kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns\") pod \"cinder3b7b-account-delete-rj27n\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:26 crc kubenswrapper[4752]: I1124 11:28:26.994733 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="ovsdbserver-nb" containerID="cri-o://342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950" gracePeriod=300 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.023261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.023325 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.023440 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svb6w\" (UniqueName: \"kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.023528 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g95m5\" (UniqueName: \"kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.024597 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.025686 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.025737 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data podName:3d820ba8-03d6-48f4-9423-bbc1ed64a36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:27.525723832 +0000 UTC m=+1313.510544121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b") : configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.038167 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.039329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.053897 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.085030 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svb6w\" (UniqueName: \"kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w\") pod \"placement83dd-account-delete-87gjb\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.130145 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.133893 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.134115 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g95m5\" (UniqueName: \"kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.134249 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.134336 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kpx\" (UniqueName: \"kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.136314 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.168005 4752 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.192109 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g95m5\" (UniqueName: \"kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5\") pod \"barbicanf581-account-delete-qm6qk\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.202860 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.214427 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.235830 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kpx\" (UniqueName: \"kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.235915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.236548 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.236827 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.254113 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.286578 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.288396 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kpx\" (UniqueName: \"kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx\") pod \"glancef4c5-account-delete-kjkjp\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.317698 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a1ee4e_2810_485d_b931_d3121fe7e264.slice/crio-e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a1ee4e_2810_485d_b931_d3121fe7e264.slice/crio-conmon-e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a1ee4e_2810_485d_b931_d3121fe7e264.slice/crio-342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.322431 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="openstack-network-exporter" containerID="cri-o://44a0f217799b6ba62959ff62bff9dd54d47f9e3c7d76763038b8475083244095" gracePeriod=300 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.335160 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ff2l8"] Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.337550 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.337623 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data podName:a1f1e943-afb7-40c3-9ff2-56791f4e0ad5 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:28.337603816 +0000 UTC m=+1314.322424105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data") pod "rabbitmq-server-0" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5") : configmap "rabbitmq-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.382626 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ff2l8"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.394954 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.396461 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.422449 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.442825 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr8g\" (UniqueName: \"kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g\") pod \"novacell119d4-account-delete-vksfq\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.443311 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts\") pod \"novacell119d4-account-delete-vksfq\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.449807 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.451388 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.476229 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.529801 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-s49hz"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.536519 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s49hz"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546647 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts\") pod \"novacell119d4-account-delete-vksfq\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546755 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzh6w\" (UniqueName: \"kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546870 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr8g\" (UniqueName: \"kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g\") pod \"novacell119d4-account-delete-vksfq\" (UID: 
\"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546959 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.546995 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2zkj\" (UniqueName: \"kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.547111 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: E1124 11:28:27.547210 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data podName:3d820ba8-03d6-48f4-9423-bbc1ed64a36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:28.547190865 +0000 UTC m=+1314.532011154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b") : configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.547529 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts\") pod \"novacell119d4-account-delete-vksfq\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.549530 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.558276 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vvkzg"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.565520 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="ovsdbserver-sb" containerID="cri-o://2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" gracePeriod=300 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.569320 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d5a1ee4e-2810-485d-b931-d3121fe7e264/ovsdbserver-nb/0.log" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.569362 4752 generic.go:334] "Generic (PLEG): container finished" podID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerID="e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127" exitCode=2 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.569396 4752 generic.go:334] "Generic (PLEG): container finished" podID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerID="342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950" exitCode=143 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.569419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerDied","Data":"e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127"} Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.569446 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerDied","Data":"342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950"} Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.584994 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vvkzg"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.588443 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr8g\" (UniqueName: \"kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g\") pod \"novacell119d4-account-delete-vksfq\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.592341 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.607179 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fdrhb"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.618960 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fdrhb"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.638830 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.640995 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650564 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzh6w\" (UniqueName: \"kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2zkj\" (UniqueName: \"kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjq64\" (UniqueName: \"kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.650829 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.651685 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.651770 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.653912 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:28:27 crc kubenswrapper[4752]: 
I1124 11:28:27.666691 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.668016 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd" containerID="cri-o://cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" gracePeriod=30 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.668153 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="openstack-network-exporter" containerID="cri-o://04d87e64c2950336bd5ce6ab63d3778aa490d6f9336803686ba547e5f016a8d5" gracePeriod=30 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.687406 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzh6w\" (UniqueName: \"kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w\") pod \"novaapib683-account-delete-k7898\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.692222 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.704697 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2zkj\" (UniqueName: \"kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj\") pod \"novacell080e2-account-delete-8s7w5\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.721515 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.735298 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.739245 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-cm8dh" podUID="01bd298d-28f4-4c50-80b3-f81ed96de794" containerName="openstack-network-exporter" containerID="cri-o://c9770a80fae990d49f8d0649788f9b375b5556a34ea582ab37233bdac9368a5a" gracePeriod=30 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.762782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjq64\" (UniqueName: \"kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.762891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.764023 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.764389 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.793210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjq64\" (UniqueName: \"kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64\") pod \"neutron574b-account-delete-nw8qc\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.812227 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.823640 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.859574 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.859857 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="dnsmasq-dns" containerID="cri-o://69b8d0fb67913018dd4cc726dbf04cca82e339eb6081fe3590586f398c788b8b" gracePeriod=10 Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.905098 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xx5fv"] Nov 24 11:28:27 crc kubenswrapper[4752]: I1124 11:28:27.954833 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xx5fv"] Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.029987 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003 is running failed: container process not found" containerID="2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.036559 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2shz6"] Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.037983 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003 is running failed: container process not found" containerID="2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.042480 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003 is running failed: container process not found" containerID="2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" 
cmd=["/usr/bin/pidof","ovsdb-server"] Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.042549 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="ovsdbserver-sb" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.116255 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2shz6"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.139805 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-27bqg"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.149884 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-27bqg"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.242856 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.244384 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-server" containerID="cri-o://c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.245918 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-server" containerID="cri-o://e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.246537 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="swift-recon-cron" containerID="cri-o://3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.246735 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="rsync" containerID="cri-o://9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.262943 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-expirer" containerID="cri-o://22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.263364 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-updater" containerID="cri-o://40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.263624 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-auditor" containerID="cri-o://655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6" 
gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.263910 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-replicator" containerID="cri-o://43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.264563 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-server" containerID="cri-o://1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.264782 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-updater" containerID="cri-o://767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.265043 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-auditor" containerID="cri-o://488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.268365 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-replicator" containerID="cri-o://b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.268812 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-auditor" containerID="cri-o://c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.269274 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-reaper" containerID="cri-o://0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.270241 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-replicator" containerID="cri-o://8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75" gracePeriod=30
Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.363563 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.363653 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data podName:a1f1e943-afb7-40c3-9ff2-56791f4e0ad5 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:30.363630332 +0000 UTC m=+1316.348450621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data") pod "rabbitmq-server-0" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5") : configmap "rabbitmq-config-data" not found Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.385238 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-t8l2x"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.465813 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-t8l2x"] Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.574662 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:28 crc kubenswrapper[4752]: E1124 11:28:28.574736 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data podName:3d820ba8-03d6-48f4-9423-bbc1ed64a36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:30.574720074 +0000 UTC m=+1316.559540363 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b") : configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.608509 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.608854 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="cinder-scheduler" containerID="cri-o://9e7930d8efc895b38537d979ec9125f2a956817fea154183ec332abf612d2f8c" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.609400 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="probe" containerID="cri-o://be46845395e6568e301fffb75b2b8cbf950babfb7216ca6d7fcccb52601c5794" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.639986 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.640288 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-57b7bbd86d-9drzs" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-log" containerID="cri-o://608e26395b4dc1de0bd00669f57c463df6c52c83abe35e40f229dda6794c160f" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.640456 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-57b7bbd86d-9drzs" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-api" containerID="cri-o://aaadfc25131dfddd66fa261c60845cf30f3a819ab939eec3683de48fc967913c" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.691274 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.691521 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api-log" containerID="cri-o://8afbde4838c68ed65d0a791d3f6cf81c9dff82063ee60fdee90cec80f669ae26" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.692007 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api" containerID="cri-o://11876dbf5b00d15eb3d002e8d11562f90c15b2bebf4c5f309e67dc8dc3e37bbb" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.701089 4752 generic.go:334] "Generic (PLEG): container finished" podID="c699191a-ace5-4413-8607-e801f9646d0d" containerID="69b8d0fb67913018dd4cc726dbf04cca82e339eb6081fe3590586f398c788b8b" exitCode=0 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.701188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" event={"ID":"c699191a-ace5-4413-8607-e801f9646d0d","Type":"ContainerDied","Data":"69b8d0fb67913018dd4cc726dbf04cca82e339eb6081fe3590586f398c788b8b"} Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.708522 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_d5a1ee4e-2810-485d-b931-d3121fe7e264/ovsdbserver-nb/0.log" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.708624 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.719420 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.719635 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-log" containerID="cri-o://5608c635aa46e4432dd124ea11759d4280b1e7c2301d6131246b4a3f068d7cee" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.719811 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-httpd" containerID="cri-o://9f8c293fe3f68cdc99eb5f8a593f9b4e799568c50b7fc38020752eb9bf6b2982" gracePeriod=30 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.733006 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d5a1ee4e-2810-485d-b931-d3121fe7e264/ovsdbserver-nb/0.log" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.746151 4752 generic.go:334] "Generic (PLEG): container finished" podID="1077002f-be90-4e09-8158-efdc98329e5b" containerID="04d87e64c2950336bd5ce6ab63d3778aa490d6f9336803686ba547e5f016a8d5" exitCode=2 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.778797 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779115 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779320 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779355 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779394 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779418 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779467 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvs2l\" (UniqueName: \"kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.779488 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config\") pod \"d5a1ee4e-2810-485d-b931-d3121fe7e264\" (UID: \"d5a1ee4e-2810-485d-b931-d3121fe7e264\") " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.780408 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config" (OuterVolumeSpecName: "config") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.782363 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts" (OuterVolumeSpecName: "scripts") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.787078 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.794003 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.798430 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l" (OuterVolumeSpecName: "kube-api-access-cvs2l") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "kube-api-access-cvs2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.814241 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-q9v5z\" not found" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.818029 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d558bea-9793-4831-9304-f8cee2b2331e/ovsdbserver-sb/0.log" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.818153 4752 generic.go:334] "Generic (PLEG): container finished" podID="7d558bea-9793-4831-9304-f8cee2b2331e" containerID="44a0f217799b6ba62959ff62bff9dd54d47f9e3c7d76763038b8475083244095" exitCode=2 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.818172 4752 generic.go:334] "Generic (PLEG): container finished" podID="7d558bea-9793-4831-9304-f8cee2b2331e" containerID="2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" exitCode=143 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.845207 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478" exitCode=0 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.856140 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cm8dh_01bd298d-28f4-4c50-80b3-f81ed96de794/openstack-network-exporter/0.log" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.856184 4752 generic.go:334] "Generic (PLEG): container finished" podID="01bd298d-28f4-4c50-80b3-f81ed96de794" containerID="c9770a80fae990d49f8d0649788f9b375b5556a34ea582ab37233bdac9368a5a" exitCode=2 Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.875085 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883020 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883048 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883059 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883068 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvs2l\" (UniqueName: \"kubernetes.io/projected/d5a1ee4e-2810-485d-b931-d3121fe7e264-kube-api-access-cvs2l\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883078 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1ee4e-2810-485d-b931-d3121fe7e264-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.883095 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.966130 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b7dddf-2bd1-4632-91ae-01abd3b77e1f" path="/var/lib/kubelet/pods/04b7dddf-2bd1-4632-91ae-01abd3b77e1f/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.966736 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecf935a-a2fa-4fb4-91f4-46f5564cb094" path="/var/lib/kubelet/pods/0ecf935a-a2fa-4fb4-91f4-46f5564cb094/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.968569 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c082fce-df36-4282-9630-e2f3089fa482" path="/var/lib/kubelet/pods/2c082fce-df36-4282-9630-e2f3089fa482/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.969896 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7812a05d-b13c-413a-ae63-9317fec939e5" path="/var/lib/kubelet/pods/7812a05d-b13c-413a-ae63-9317fec939e5/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.972373 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80204aea-8ec7-4653-bfad-d2e6800ecf6e" path="/var/lib/kubelet/pods/80204aea-8ec7-4653-bfad-d2e6800ecf6e/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.973409 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ca5590-9544-4b84-b86d-1ec3ef57a829" path="/var/lib/kubelet/pods/96ca5590-9544-4b84-b86d-1ec3ef57a829/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.973954 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4" path="/var/lib/kubelet/pods/9a79fb50-4f8a-4cbe-9efd-f3fd44dfb4c4/volumes" Nov 24 11:28:28 crc kubenswrapper[4752]: I1124 11:28:28.975042 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9" path="/var/lib/kubelet/pods/dd2d15cf-d759-41f9-9b5b-8a6deb1a27d9/volumes" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.019958 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.023351 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.039271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d5a1ee4e-2810-485d-b931-d3121fe7e264" (UID: "d5a1ee4e-2810-485d-b931-d3121fe7e264"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.089563 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.089594 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a1ee4e-2810-485d-b931-d3121fe7e264-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.089603 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.120145 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd" probeResult="failure" output=< Nov 24 11:28:29 crc kubenswrapper[4752]: 2025-11-24T11:28:29Z|00001|unixctl|WARN|failed to connect to /var/run/openvswitch/ovs-vswitchd.11.ctl Nov 24 11:28:29 crc kubenswrapper[4752]: ovs-appctl: cannot connect to "/var/run/openvswitch/ovs-vswitchd.11.ctl" (No such file or directory) Nov 24 11:28:29 crc kubenswrapper[4752]: ERROR - Failed retrieving ofproto/list from ovs-vswitchd Nov 24 11:28:29 crc kubenswrapper[4752]: > Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.165711 4752 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Nov 24 11:28:29 crc kubenswrapper[4752]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 24 11:28:29 crc kubenswrapper[4752]: + source /usr/local/bin/container-scripts/functions Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNBridge=br-int Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNRemote=tcp:localhost:6642 Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNEncapType=geneve Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNAvailabilityZones= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ EnableChassisAsGateway=true Nov 24 
11:28:29 crc kubenswrapper[4752]: ++ PhysicalNetworks= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNHostName= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 24 11:28:29 crc kubenswrapper[4752]: ++ ovs_dir=/var/lib/openvswitch Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 24 11:28:29 crc kubenswrapper[4752]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + cleanup_ovsdb_server_semaphore Nov 24 11:28:29 crc kubenswrapper[4752]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 24 11:28:29 crc kubenswrapper[4752]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-v6jhw" message=< Nov 24 11:28:29 crc kubenswrapper[4752]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 24 11:28:29 crc kubenswrapper[4752]: + source /usr/local/bin/container-scripts/functions Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNBridge=br-int Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNRemote=tcp:localhost:6642 Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNEncapType=geneve Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNAvailabilityZones= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ EnableChassisAsGateway=true Nov 24 11:28:29 crc kubenswrapper[4752]: ++ PhysicalNetworks= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNHostName= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 24 11:28:29 crc kubenswrapper[4752]: ++ ovs_dir=/var/lib/openvswitch Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 24 11:28:29 crc kubenswrapper[4752]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + cleanup_ovsdb_server_semaphore Nov 24 11:28:29 crc kubenswrapper[4752]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 24 11:28:29 crc kubenswrapper[4752]: > Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.165850 4752 kuberuntime_container.go:691] "PreStop hook failed" err=< Nov 24 11:28:29 crc kubenswrapper[4752]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 24 11:28:29 crc kubenswrapper[4752]: + source /usr/local/bin/container-scripts/functions Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNBridge=br-int Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNRemote=tcp:localhost:6642 Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNEncapType=geneve Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNAvailabilityZones= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ EnableChassisAsGateway=true Nov 24 11:28:29 crc kubenswrapper[4752]: ++ PhysicalNetworks= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ OVNHostName= Nov 24 11:28:29 crc kubenswrapper[4752]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 24 11:28:29 crc kubenswrapper[4752]: ++ ovs_dir=/var/lib/openvswitch Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 24 11:28:29 crc kubenswrapper[4752]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 24 11:28:29 crc kubenswrapper[4752]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + sleep 0.5 Nov 24 11:28:29 crc kubenswrapper[4752]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 24 11:28:29 crc kubenswrapper[4752]: + cleanup_ovsdb_server_semaphore Nov 24 11:28:29 crc kubenswrapper[4752]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 24 11:28:29 crc kubenswrapper[4752]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 24 11:28:29 crc kubenswrapper[4752]: > pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server" containerID="cri-o://8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.165900 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server" containerID="cri-o://8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" gracePeriod=29 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.198429 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd" containerID="cri-o://3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" gracePeriod=29 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201592 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201664 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201683 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201696 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d5a1ee4e-2810-485d-b931-d3121fe7e264","Type":"ContainerDied","Data":"825da2ef53156d201cfd38c810103d4083ba547ce727cf4b78de89b162548a38"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201754 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerDied","Data":"04d87e64c2950336bd5ce6ab63d3778aa490d6f9336803686ba547e5f016a8d5"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201771 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerDied","Data":"44a0f217799b6ba62959ff62bff9dd54d47f9e3c7d76763038b8475083244095"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201790 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201804 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerDied","Data":"2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201815 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201827 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
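The three identical shell traces above are kubelet printing the same PreStop hook output once per error path; exit code 137 is 128+9, meaning the hook was SIGKILLed while still running. A minimal bash sketch of the wait-for-semaphore pattern the trace shows; the paths and the cleanup_ovsdb_server_semaphore name come straight from the trace, but packaging it as a standalone script is an assumption for illustration:

    #!/usr/bin/env bash
    # Sketch of the stop-ovsdb-server.sh logic visible in the trace above:
    # block until a sibling container marks it safe, then stop ovsdb-server.
    SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server

    cleanup_ovsdb_server_semaphore() {
        rm -f "$SEMAPHORE"
    }

    # If the pod's grace period (29s here) expires while this script is
    # still running, the runtime SIGKILLs it and kubelet logs
    # "exited with 137", exactly as seen above.
    while [ ! -f "$SEMAPHORE" ]; do
        sleep 0.5
    done
    cleanup_ovsdb_server_semaphore
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd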
pods=["openstack/nova-cell1-db-create-fmqfk"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201841 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fmqfk"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201876 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201889 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-19d4-account-create-84bgb"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201901 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cm8dh" event={"ID":"01bd298d-28f4-4c50-80b3-f81ed96de794","Type":"ContainerDied","Data":"c9770a80fae990d49f8d0649788f9b375b5556a34ea582ab37233bdac9368a5a"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201914 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201936 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-19d4-account-create-84bgb"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201948 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201961 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201973 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.201985 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.202004 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.202270 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler" containerID="cri-o://84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.202681 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-log" containerID="cri-o://4903a5653dc6c301907cc4b8a087af2718aabe36426414f0ba709fb8e91a2e8c" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.202864 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-log" containerID="cri-o://9568b7bef3cbeeeda117b907e1246694e8acf31ceba1e01df07fa02cb3727f40" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203052 4752 scope.go:117] "RemoveContainer" 
containerID="e53bf221b6c3d910bf35949c2778fbbe1ce10498615e17af98ca6b7a7b5fd127" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203347 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log" containerID="cri-o://6d5f6c189c0f341ca1cfaf61e6a7acbcdd7eb6be88bcb7818cbe04ea74b7b501" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203491 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56dd6c8857-kp42z" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-api" containerID="cri-o://31f4463200798c8e71465693c21c3b267736953dd9c6093042d651dea9d77b08" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203624 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener-log" containerID="cri-o://92a697b4785f20d4c2e3ad0c1a378d7b86f11315ec2519ac9828547a6554fe0b" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203794 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68f8b8b648-65q5g" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api-log" containerID="cri-o://b77d7cf0b069a2acc502c8ead2f25a536ecf99b661c7953794e5aa4a197061a2" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.203940 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker-log" containerID="cri-o://b756f1cb33ac9fd77f5462fb04d0efd55526c3b8882a7fb850fc306538938658" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204063 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0487424d-5178-4843-ae8d-db6c015fe9d4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://456e104a9500a5d75fcca8f093e7825be12c29b5f33ca5320de653b354d9b91c" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204334 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-httpd" containerID="cri-o://944ddefc80e86ba009045369f39188fcfd4ed96c67435d31b540b52189cc2961" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204402 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56dd6c8857-kp42z" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-httpd" containerID="cri-o://b891d4bc5a7581172d59c7e614c2ede206cf3a7bc2dcfbffabe7f2e2bc23b602" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204444 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener" containerID="cri-o://1b7082498c964fc9737d85041d682c2e7e12fce540ffa1aaeb63a62e8dc412bf" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204489 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-api" containerID="cri-o://858e76b45449fa0fcdbccdf4d4f6325a65859816c0ee8d347ba69396fbd78a83" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204526 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68f8b8b648-65q5g" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api" containerID="cri-o://58ade1c43e6907e3577011461f577c10b33e8682d39ece87e8fbf57d78ed060c" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204575 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker" containerID="cri-o://30b709fe7204f5ebbe3e77680225e7690d93a9b9881d529fce8b592b7c3e865e" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.204584 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata" containerID="cri-o://820c72aca67b79ea64a516208b0eaa7594ab5684ba110221a1b876bae259b74a" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.332778 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="rabbitmq" containerID="cri-o://d59dfb8e5264cefe368e6631c2fcd950f686b3a37ff453ac4489ed24628cf2fd" gracePeriod=604800 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.376497 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="rabbitmq" containerID="cri-o://d0225139306006f6265b3807404dbc72413aeefa6c2ec7ca1125827941ca3fbd" gracePeriod=604800 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.464919 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.494720 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wm6ct"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.509222 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wm6ct"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.552839 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.553144 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" gracePeriod=30 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.586231 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qgtl"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.595715 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7qgtl"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.605820 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement83dd-account-delete-87gjb"] Nov 24 11:28:29 crc 
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.612903 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder3b7b-account-delete-rj27n"]
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.614354 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="galera" containerID="cri-o://00c9561e9a5735f6ed6ea2bd3f779c56f52d92d1f868c56150ff9276fc50cb92" gracePeriod=30
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.614836 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d558bea-9793-4831-9304-f8cee2b2331e/ovsdbserver-sb/0.log"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.614894 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.624559 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.625665 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cm8dh_01bd298d-28f4-4c50-80b3-f81ed96de794/openstack-network-exporter/0.log"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.625706 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cm8dh"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.654962 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.698692 4752 scope.go:117] "RemoveContainer" containerID="342bfdc4cfd18e6bd3ed709a4a89ca54622aa4743390cc89a011bb7607799950"
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702633 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle\") pod \"09540fa5-6ff8-45cc-98be-968283dc2bfd\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702680 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702719 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702770 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702815 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702856 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702876 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.702914 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnwg5\" (UniqueName: \"kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703152 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703184 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcb9z\" (UniqueName: \"kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703251 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703327 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret\") pod \"09540fa5-6ff8-45cc-98be-968283dc2bfd\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703353 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703382 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config\") pod \"09540fa5-6ff8-45cc-98be-968283dc2bfd\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703411 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703440 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703491 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvtlv\" (UniqueName: \"kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv\") pod \"09540fa5-6ff8-45cc-98be-968283dc2bfd\" (UID: \"09540fa5-6ff8-45cc-98be-968283dc2bfd\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703517 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb\") pod \"c699191a-ace5-4413-8607-e801f9646d0d\" (UID: \"c699191a-ace5-4413-8607-e801f9646d0d\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703550 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703580 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703609 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bcwj\" (UniqueName: \"kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj\") pod \"01bd298d-28f4-4c50-80b3-f81ed96de794\" (UID: \"01bd298d-28f4-4c50-80b3-f81ed96de794\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.703639 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs\") pod \"7d558bea-9793-4831-9304-f8cee2b2331e\" (UID: \"7d558bea-9793-4831-9304-f8cee2b2331e\") "
Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.706815 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts" (OuterVolumeSpecName: "scripts") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
"7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.706862 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config" (OuterVolumeSpecName: "config") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.711413 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv" (OuterVolumeSpecName: "kube-api-access-rvtlv") pod "09540fa5-6ff8-45cc-98be-968283dc2bfd" (UID: "09540fa5-6ff8-45cc-98be-968283dc2bfd"). InnerVolumeSpecName "kube-api-access-rvtlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.721888 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.722876 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.728680 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.732972 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.738898 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj" (OuterVolumeSpecName: "kube-api-access-6bcwj") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "kube-api-access-6bcwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.743589 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config" (OuterVolumeSpecName: "config") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.744738 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5" (OuterVolumeSpecName: "kube-api-access-vnwg5") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "kube-api-access-vnwg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.750607 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z" (OuterVolumeSpecName: "kube-api-access-hcb9z") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "kube-api-access-hcb9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.812386 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "09540fa5-6ff8-45cc-98be-968283dc2bfd" (UID: "09540fa5-6ff8-45cc-98be-968283dc2bfd"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817058 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817103 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcb9z\" (UniqueName: \"kubernetes.io/projected/7d558bea-9793-4831-9304-f8cee2b2331e-kube-api-access-hcb9z\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817114 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817124 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817133 4752 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovs-rundir\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817144 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvtlv\" (UniqueName: \"kubernetes.io/projected/09540fa5-6ff8-45cc-98be-968283dc2bfd-kube-api-access-rvtlv\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817155 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01bd298d-28f4-4c50-80b3-f81ed96de794-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817163 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01bd298d-28f4-4c50-80b3-f81ed96de794-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817171 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bcwj\" (UniqueName: \"kubernetes.io/projected/01bd298d-28f4-4c50-80b3-f81ed96de794-kube-api-access-6bcwj\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817179 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817187 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d558bea-9793-4831-9304-f8cee2b2331e-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.817195 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnwg5\" (UniqueName: \"kubernetes.io/projected/c699191a-ace5-4413-8607-e801f9646d0d-kube-api-access-vnwg5\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.831536 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.843369 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.847842 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09540fa5-6ff8-45cc-98be-968283dc2bfd" (UID: "09540fa5-6ff8-45cc-98be-968283dc2bfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.858317 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:29 crc kubenswrapper[4752]: E1124 11:28:29.858384 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.902701 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.911474 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.917962 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cm8dh_01bd298d-28f4-4c50-80b3-f81ed96de794/openstack-network-exporter/0.log" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.918043 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cm8dh" event={"ID":"01bd298d-28f4-4c50-80b3-f81ed96de794","Type":"ContainerDied","Data":"7ffa3c32b294f924b2c9e536b2d0f491231355abdf781c8100897626e5c10bdb"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.918080 4752 scope.go:117] "RemoveContainer" containerID="c9770a80fae990d49f8d0649788f9b375b5556a34ea582ab37233bdac9368a5a" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.918186 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cm8dh" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.919860 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.919876 4752 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.919887 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.929916 4752 generic.go:334] "Generic (PLEG): container finished" podID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerID="5608c635aa46e4432dd124ea11759d4280b1e7c2301d6131246b4a3f068d7cee" exitCode=143 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.929965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerDied","Data":"5608c635aa46e4432dd124ea11759d4280b1e7c2301d6131246b4a3f068d7cee"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.939029 4752 generic.go:334] "Generic (PLEG): container finished" podID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerID="b756f1cb33ac9fd77f5462fb04d0efd55526c3b8882a7fb850fc306538938658" exitCode=143 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.939091 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.939113 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerDied","Data":"b756f1cb33ac9fd77f5462fb04d0efd55526c3b8882a7fb850fc306538938658"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.946346 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.959070 4752 generic.go:334] "Generic (PLEG): container finished" podID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerID="b891d4bc5a7581172d59c7e614c2ede206cf3a7bc2dcfbffabe7f2e2bc23b602" exitCode=0 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.959125 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerDied","Data":"b891d4bc5a7581172d59c7e614c2ede206cf3a7bc2dcfbffabe7f2e2bc23b602"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.971141 4752 generic.go:334] "Generic (PLEG): container finished" podID="09540fa5-6ff8-45cc-98be-968283dc2bfd" containerID="cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b" exitCode=137 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.971305 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.976097 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.987227 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerDied","Data":"8afbde4838c68ed65d0a791d3f6cf81c9dff82063ee60fdee90cec80f669ae26"} Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.989789 4752 generic.go:334] "Generic (PLEG): container finished" podID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerID="8afbde4838c68ed65d0a791d3f6cf81c9dff82063ee60fdee90cec80f669ae26" exitCode=143 Nov 24 11:28:29 crc kubenswrapper[4752]: I1124 11:28:29.995271 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.006785 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.020762 4752 generic.go:334] "Generic (PLEG): container finished" podID="9d0084f6-064e-4089-87fe-4ead63923b56" containerID="be46845395e6568e301fffb75b2b8cbf950babfb7216ca6d7fcccb52601c5794" exitCode=0 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.020851 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerDied","Data":"be46845395e6568e301fffb75b2b8cbf950babfb7216ca6d7fcccb52601c5794"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.022017 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.026777 4752 generic.go:334] "Generic (PLEG): container finished" podID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerID="6d5f6c189c0f341ca1cfaf61e6a7acbcdd7eb6be88bcb7818cbe04ea74b7b501" exitCode=143 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.026854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerDied","Data":"6d5f6c189c0f341ca1cfaf61e6a7acbcdd7eb6be88bcb7818cbe04ea74b7b501"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.050720 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"]
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.063700 4752 generic.go:334] "Generic (PLEG): container finished" podID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerID="1b7082498c964fc9737d85041d682c2e7e12fce540ffa1aaeb63a62e8dc412bf" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.063752 4752 generic.go:334] "Generic (PLEG): container finished" podID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerID="92a697b4785f20d4c2e3ad0c1a378d7b86f11315ec2519ac9828547a6554fe0b" exitCode=143
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.063827 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerDied","Data":"1b7082498c964fc9737d85041d682c2e7e12fce540ffa1aaeb63a62e8dc412bf"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.063859 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerDied","Data":"92a697b4785f20d4c2e3ad0c1a378d7b86f11315ec2519ac9828547a6554fe0b"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.066934 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"]
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.085019 4752 generic.go:334] "Generic (PLEG): container finished" podID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.085099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerDied","Data":"8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.102161 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7d558bea-9793-4831-9304-f8cee2b2331e/ovsdbserver-sb/0.log"
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.102787 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.103064 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7d558bea-9793-4831-9304-f8cee2b2331e","Type":"ContainerDied","Data":"c55fa00ce217c9ff143279ea2ebf102cf6985dd0a7e29e449f51597e7dc588e9"}
Nov 24 11:28:30 crc kubenswrapper[4752]: W1124 11:28:30.104116 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod273fc6bc_214e_4fca_8939_003f983d6aa1.slice/crio-abd2e1662ed7dcf11502498345efcbb4ba177e3072b9adfe74c59d85b7c85869 WatchSource:0}: Error finding container abd2e1662ed7dcf11502498345efcbb4ba177e3072b9adfe74c59d85b7c85869: Status 404 returned error can't find the container with id abd2e1662ed7dcf11502498345efcbb4ba177e3072b9adfe74c59d85b7c85869
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.113180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement83dd-account-delete-87gjb" event={"ID":"7cb3afd6-0557-4a85-9763-1572d92e6aa3","Type":"ContainerStarted","Data":"d1ebfc6f10dd4eb52b4acd0b3bb270a0fd32ec1642ea5d5b5155483350b10e02"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.115445 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3b7b-account-delete-rj27n" event={"ID":"6b9d21e3-863e-4129-a794-41c3bb8899df","Type":"ContainerStarted","Data":"10c4b439ce7ad25316b6f53a5450e988eb5c29a9c5fefcd5bcf1ef2742400177"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.116772 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.129084 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.204503 4752 scope.go:117] "RemoveContainer" containerID="cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b"
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214196 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214227 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214238 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214247 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214254 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214260 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214266 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214273 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214279 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214286 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214292 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214299 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214305 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177" exitCode=0
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214352 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214387 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214396 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214404 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214434 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214459 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214471 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4"}
Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75"}
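Nearly every "container finished" record in this burst reports exitCode=0 or exitCode=143, with the occasional 137: any code above 128 is 128 plus the fatal signal, so 143 is a process that exited on SIGTERM within its grace period and 137 one that had to be SIGKILLed (as with the openstackclient container at 11:28:29.971141). A one-line bash sketch for decoding such codes:

    # Exit codes above 128 encode 128 + signal number (143 -> TERM, 137 -> KILL).
    code=143; [ "$code" -gt 128 ] && kill -l $((code - 128)) || echo "exited $code"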
event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.214493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.217126 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"] Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.217403 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-79cf79466c-6hfp8" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-httpd" containerID="cri-o://38c41eef16564abd80395499860a5b1c7de6d60c007496e866d4d37f4047cebc" gracePeriod=30 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.217815 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-79cf79466c-6hfp8" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-server" containerID="cri-o://3971caac8147987cbdf6ff339a4a02e13902ca6259e0896f8df16f72509dc49d" gracePeriod=30 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.227543 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.232204 4752 generic.go:334] "Generic (PLEG): container finished" podID="a6e4558a-350e-442a-ad61-70d9a9824219" containerID="608e26395b4dc1de0bd00669f57c463df6c52c83abe35e40f229dda6794c160f" exitCode=143 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.232296 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerDied","Data":"608e26395b4dc1de0bd00669f57c463df6c52c83abe35e40f229dda6794c160f"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.235901 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.241778 4752 generic.go:334] "Generic (PLEG): container finished" podID="6c592991-2ffb-417a-aa80-49d0111618bb" containerID="b77d7cf0b069a2acc502c8ead2f25a536ecf99b661c7953794e5aa4a197061a2" exitCode=143 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.241856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerDied","Data":"b77d7cf0b069a2acc502c8ead2f25a536ecf99b661c7953794e5aa4a197061a2"} Nov 24 11:28:30 crc kubenswrapper[4752]: W1124 11:28:30.246127 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b9f59e_8570_4b2a_9c7b_14be641f74fe.slice/crio-658624b76f0dab253674f5903fd66093ed8665d386f9a2195deb9b3c741fc580 WatchSource:0}: Error finding container 658624b76f0dab253674f5903fd66093ed8665d386f9a2195deb9b3c741fc580: Status 404 returned error can't find the container with id 658624b76f0dab253674f5903fd66093ed8665d386f9a2195deb9b3c741fc580 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.247776 4752 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.248553 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-65sfh" event={"ID":"c699191a-ace5-4413-8607-e801f9646d0d","Type":"ContainerDied","Data":"1eabf3bee073b42427c876390f73ac696e84edd5e177af294cff5ca5999ecd4f"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.251979 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.255853 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.258561 4752 generic.go:334] "Generic (PLEG): container finished" podID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerID="4903a5653dc6c301907cc4b8a087af2718aabe36426414f0ba709fb8e91a2e8c" exitCode=143 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.258697 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerDied","Data":"4903a5653dc6c301907cc4b8a087af2718aabe36426414f0ba709fb8e91a2e8c"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.263394 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerID="9568b7bef3cbeeeda117b907e1246694e8acf31ceba1e01df07fa02cb3727f40" exitCode=143 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.263583 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc" gracePeriod=30 Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.264852 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerDied","Data":"9568b7bef3cbeeeda117b907e1246694e8acf31ceba1e01df07fa02cb3727f40"} Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.314218 4752 scope.go:117] "RemoveContainer" containerID="cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b" Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.314723 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b\": container with ID starting with cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b not found: ID does not exist" containerID="cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.314882 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b"} err="failed to get container status \"cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b\": rpc error: code = NotFound desc = could not find container 
\"cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b\": container with ID starting with cd44959aa1a906397cf9d91f94ae613e3c4579ada93cf6e70cc97f819550202b not found: ID does not exist" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.314931 4752 scope.go:117] "RemoveContainer" containerID="44a0f217799b6ba62959ff62bff9dd54d47f9e3c7d76763038b8475083244095" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.328987 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.375095 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.393209 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.411089 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.411137 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler" Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.433734 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.433978 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data podName:a1f1e943-afb7-40c3-9ff2-56791f4e0ad5 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.433959924 +0000 UTC m=+1320.418780203 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data") pod "rabbitmq-server-0" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5") : configmap "rabbitmq-config-data" not found Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.538734 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.548253 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config" (OuterVolumeSpecName: "config") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.618409 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "09540fa5-6ff8-45cc-98be-968283dc2bfd" (UID: "09540fa5-6ff8-45cc-98be-968283dc2bfd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.635259 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.637890 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "01bd298d-28f4-4c50-80b3-f81ed96de794" (UID: "01bd298d-28f4-4c50-80b3-f81ed96de794"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.642362 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01bd298d-28f4-4c50-80b3-f81ed96de794-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.642391 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09540fa5-6ff8-45cc-98be-968283dc2bfd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.642401 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.642410 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.642420 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.642488 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:30 crc kubenswrapper[4752]: E1124 11:28:30.642536 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data podName:3d820ba8-03d6-48f4-9423-bbc1ed64a36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.642518612 +0000 UTC m=+1320.627338901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b") : configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.686605 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c699191a-ace5-4413-8607-e801f9646d0d" (UID: "c699191a-ace5-4413-8607-e801f9646d0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.697353 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "7d558bea-9793-4831-9304-f8cee2b2331e" (UID: "7d558bea-9793-4831-9304-f8cee2b2331e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.744386 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c699191a-ace5-4413-8607-e801f9646d0d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.744438 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d558bea-9793-4831-9304-f8cee2b2331e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.770395 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09540fa5-6ff8-45cc-98be-968283dc2bfd" path="/var/lib/kubelet/pods/09540fa5-6ff8-45cc-98be-968283dc2bfd/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.774434 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293563af-85f4-47d2-87d0-15f7172c173f" path="/var/lib/kubelet/pods/293563af-85f4-47d2-87d0-15f7172c173f/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.775163 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55" path="/var/lib/kubelet/pods/2b39a6a4-a9e4-4fd2-9f24-fb4827db8b55/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.775861 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d48e84d-ad34-4d80-b0f4-0f332b45b21d" path="/var/lib/kubelet/pods/3d48e84d-ad34-4d80-b0f4-0f332b45b21d/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.776662 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e8d35f-96ad-4182-a7dd-2adbbb113508" path="/var/lib/kubelet/pods/94e8d35f-96ad-4182-a7dd-2adbbb113508/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.777945 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" path="/var/lib/kubelet/pods/d5a1ee4e-2810-485d-b931-d3121fe7e264/volumes" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.902365 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-79cf79466c-6hfp8" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.167:8080/healthcheck\": dial tcp 10.217.0.167:8080: connect: connection refused" Nov 24 11:28:30 crc kubenswrapper[4752]: I1124 11:28:30.902636 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-79cf79466c-6hfp8" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.167:8080/healthcheck\": dial tcp 10.217.0.167:8080: connect: connection refused" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.144086 4752 scope.go:117] "RemoveContainer" containerID="2ea78ef62ccc3e6245a885fb49cb63708cc7ee44c9d209af4583cd80c3340003" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.203373 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.228946 4752 scope.go:117] "RemoveContainer" containerID="69b8d0fb67913018dd4cc726dbf04cca82e339eb6081fe3590586f398c788b8b" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.290717 4752 generic.go:334] "Generic (PLEG): container finished" podID="6b9d21e3-863e-4129-a794-41c3bb8899df" containerID="ab779a1c42b633961d1f5b419a52d5bcf3916d2d4fe9bc9ba7cc57e2fc21152c" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.290796 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3b7b-account-delete-rj27n" event={"ID":"6b9d21e3-863e-4129-a794-41c3bb8899df","Type":"ContainerDied","Data":"ab779a1c42b633961d1f5b419a52d5bcf3916d2d4fe9bc9ba7cc57e2fc21152c"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.300230 4752 generic.go:334] "Generic (PLEG): container finished" podID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerID="f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.300308 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d92c8e6-2992-41be-950b-1eba6c84b636","Type":"ContainerDied","Data":"f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.327103 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell080e2-account-delete-8s7w5" event={"ID":"273fc6bc-214e-4fca-8939-003f983d6aa1","Type":"ContainerStarted","Data":"abd2e1662ed7dcf11502498345efcbb4ba177e3072b9adfe74c59d85b7c85869"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.329150 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron574b-account-delete-nw8qc" event={"ID":"a8b9f59e-8570-4b2a-9c7b-14be641f74fe","Type":"ContainerStarted","Data":"658624b76f0dab253674f5903fd66093ed8665d386f9a2195deb9b3c741fc580"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.335275 4752 generic.go:334] "Generic (PLEG): container finished" podID="7cb3afd6-0557-4a85-9763-1572d92e6aa3" containerID="a4d46d258f63ded30ef4a97217ef0d958fa5fa116dacc0a15e72c21b6b729660" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.335357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement83dd-account-delete-87gjb" event={"ID":"7cb3afd6-0557-4a85-9763-1572d92e6aa3","Type":"ContainerDied","Data":"a4d46d258f63ded30ef4a97217ef0d958fa5fa116dacc0a15e72c21b6b729660"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.341943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib683-account-delete-k7898" event={"ID":"11bd1a8d-cbf9-47c6-a116-538f60634a33","Type":"ContainerStarted","Data":"000d10841c1abbd59399298c9ba35003072190b03182e79f0d52675198c8d93c"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.343887 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" event={"ID":"dd899b7f-383d-45fd-9ec8-48caa4f954f2","Type":"ContainerDied","Data":"c497636f28d92eb377f777117aba918117bba1876dda5e891bc31443246d02c2"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.343969 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.345559 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancef4c5-account-delete-kjkjp" event={"ID":"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0","Type":"ContainerStarted","Data":"718a682b19ea651cbbac9169feb75607846abc1ed86f2897ee2f998e2085d1aa"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.347231 4752 generic.go:334] "Generic (PLEG): container finished" podID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerID="3971caac8147987cbdf6ff339a4a02e13902ca6259e0896f8df16f72509dc49d" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.347254 4752 generic.go:334] "Generic (PLEG): container finished" podID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerID="38c41eef16564abd80395499860a5b1c7de6d60c007496e866d4d37f4047cebc" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.347293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerDied","Data":"3971caac8147987cbdf6ff339a4a02e13902ca6259e0896f8df16f72509dc49d"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.347315 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerDied","Data":"38c41eef16564abd80395499860a5b1c7de6d60c007496e866d4d37f4047cebc"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.348348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell119d4-account-delete-vksfq" event={"ID":"51186e60-74ec-447f-afd0-14056f01d5a3","Type":"ContainerStarted","Data":"cd4de94aac35e65688ccf20767afcfad345409b7ffd4e0069b54b5fcab53a480"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.349304 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanf581-account-delete-qm6qk" event={"ID":"31369e0b-e7eb-44cc-9d7e-21d196d95ad3","Type":"ContainerStarted","Data":"9d7d65b03f15c7ef19a705756da017793b0f64eecc214cb8ef05b8218160aea7"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.356462 4752 generic.go:334] "Generic (PLEG): container finished" podID="0487424d-5178-4843-ae8d-db6c015fe9d4" containerID="456e104a9500a5d75fcca8f093e7825be12c29b5f33ca5320de653b354d9b91c" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.356526 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0487424d-5178-4843-ae8d-db6c015fe9d4","Type":"ContainerDied","Data":"456e104a9500a5d75fcca8f093e7825be12c29b5f33ca5320de653b354d9b91c"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.356553 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0487424d-5178-4843-ae8d-db6c015fe9d4","Type":"ContainerDied","Data":"051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.356566 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051d7c0a52a159a27d5047b5f7dab7a88c7edd528d475565041bec4c4be853b6" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.358626 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom\") pod 
\"dd899b7f-383d-45fd-9ec8-48caa4f954f2\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.358699 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs\") pod \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.358789 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxnkx\" (UniqueName: \"kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx\") pod \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.358849 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data\") pod \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.358887 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle\") pod \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\" (UID: \"dd899b7f-383d-45fd-9ec8-48caa4f954f2\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.361361 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs" (OuterVolumeSpecName: "logs") pod "dd899b7f-383d-45fd-9ec8-48caa4f954f2" (UID: "dd899b7f-383d-45fd-9ec8-48caa4f954f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.372041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx" (OuterVolumeSpecName: "kube-api-access-pxnkx") pod "dd899b7f-383d-45fd-9ec8-48caa4f954f2" (UID: "dd899b7f-383d-45fd-9ec8-48caa4f954f2"). InnerVolumeSpecName "kube-api-access-pxnkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.372869 4752 generic.go:334] "Generic (PLEG): container finished" podID="e3426b66-ea91-4d99-86c5-955d77073619" containerID="00c9561e9a5735f6ed6ea2bd3f779c56f52d92d1f868c56150ff9276fc50cb92" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.373560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerDied","Data":"00c9561e9a5735f6ed6ea2bd3f779c56f52d92d1f868c56150ff9276fc50cb92"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.373616 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3426b66-ea91-4d99-86c5-955d77073619","Type":"ContainerDied","Data":"10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.373647 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b7753a7df64fd559a8dafe71a80a27e8b14aaa19eaaf5c6094695d8f1bf827" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.385013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd899b7f-383d-45fd-9ec8-48caa4f954f2" (UID: "dd899b7f-383d-45fd-9ec8-48caa4f954f2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.387237 4752 generic.go:334] "Generic (PLEG): container finished" podID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerID="30b709fe7204f5ebbe3e77680225e7690d93a9b9881d529fce8b592b7c3e865e" exitCode=0 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.387274 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerDied","Data":"30b709fe7204f5ebbe3e77680225e7690d93a9b9881d529fce8b592b7c3e865e"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.387301 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" event={"ID":"7cb9117e-3eeb-4c01-af6a-643eec81666c","Type":"ContainerDied","Data":"732074ae4e8be72a1a53e99b47e2b6e7e29e754a05bc470943479a5ff3b53dd8"} Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.387311 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732074ae4e8be72a1a53e99b47e2b6e7e29e754a05bc470943479a5ff3b53dd8" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.429280 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd899b7f-383d-45fd-9ec8-48caa4f954f2" (UID: "dd899b7f-383d-45fd-9ec8-48caa4f954f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.460835 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxnkx\" (UniqueName: \"kubernetes.io/projected/dd899b7f-383d-45fd-9ec8-48caa4f954f2-kube-api-access-pxnkx\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.460863 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.460874 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.460883 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd899b7f-383d-45fd-9ec8-48caa4f954f2-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.485925 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data" (OuterVolumeSpecName: "config-data") pod "dd899b7f-383d-45fd-9ec8-48caa4f954f2" (UID: "dd899b7f-383d-45fd-9ec8-48caa4f954f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: E1124 11:28:31.492821 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279 is running failed: container process not found" containerID="f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:31 crc kubenswrapper[4752]: E1124 11:28:31.493322 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279 is running failed: container process not found" containerID="f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:31 crc kubenswrapper[4752]: E1124 11:28:31.493803 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279 is running failed: container process not found" containerID="f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:31 crc kubenswrapper[4752]: E1124 11:28:31.493843 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerName="nova-cell0-conductor-conductor" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.548698 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.559844 4752 scope.go:117] "RemoveContainer" containerID="5c192d14aa6a7bd9c02c00cdae6e1a39c7650144567990030adf01797a110db9" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.562315 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd899b7f-383d-45fd-9ec8-48caa4f954f2-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.663571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom\") pod \"7cb9117e-3eeb-4c01-af6a-643eec81666c\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.663629 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data\") pod \"7cb9117e-3eeb-4c01-af6a-643eec81666c\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.663658 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq42q\" (UniqueName: \"kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q\") pod \"7cb9117e-3eeb-4c01-af6a-643eec81666c\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.663714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle\") pod \"7cb9117e-3eeb-4c01-af6a-643eec81666c\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.663792 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs\") pod \"7cb9117e-3eeb-4c01-af6a-643eec81666c\" (UID: \"7cb9117e-3eeb-4c01-af6a-643eec81666c\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.665113 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs" (OuterVolumeSpecName: "logs") pod "7cb9117e-3eeb-4c01-af6a-643eec81666c" (UID: "7cb9117e-3eeb-4c01-af6a-643eec81666c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.673653 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.686543 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.716584 4752 scope.go:117] "RemoveContainer" containerID="1b7082498c964fc9737d85041d682c2e7e12fce540ffa1aaeb63a62e8dc412bf" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.717378 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-cm8dh"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.717550 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7cb9117e-3eeb-4c01-af6a-643eec81666c" (UID: "7cb9117e-3eeb-4c01-af6a-643eec81666c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.739986 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q" (OuterVolumeSpecName: "kube-api-access-tq42q") pod "7cb9117e-3eeb-4c01-af6a-643eec81666c" (UID: "7cb9117e-3eeb-4c01-af6a-643eec81666c"). InnerVolumeSpecName "kube-api-access-tq42q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.772354 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs\") pod \"0487424d-5178-4843-ae8d-db6c015fe9d4\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.772397 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle\") pod \"0487424d-5178-4843-ae8d-db6c015fe9d4\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.772423 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs\") pod \"0487424d-5178-4843-ae8d-db6c015fe9d4\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.772525 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data\") pod \"0487424d-5178-4843-ae8d-db6c015fe9d4\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.772554 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whnns\" (UniqueName: \"kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns\") pod \"0487424d-5178-4843-ae8d-db6c015fe9d4\" (UID: \"0487424d-5178-4843-ae8d-db6c015fe9d4\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.774361 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data-custom\") on node \"crc\" DevicePath 
\"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.774384 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq42q\" (UniqueName: \"kubernetes.io/projected/7cb9117e-3eeb-4c01-af6a-643eec81666c-kube-api-access-tq42q\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.774396 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb9117e-3eeb-4c01-af6a-643eec81666c-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.801864 4752 scope.go:117] "RemoveContainer" containerID="92a697b4785f20d4c2e3ad0c1a378d7b86f11315ec2519ac9828547a6554fe0b" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.841527 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.859246 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.859579 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-central-agent" containerID="cri-o://42af4a9e7e4f79d0e530981eec488f3193a8e0ab8487ba8583a6beacc33cd3f9" gracePeriod=30 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.859718 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="proxy-httpd" containerID="cri-o://b64a3d9cdc49b880ccf1a894ebc75a567c3fbafc89ff71380851b7711949247f" gracePeriod=30 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.859767 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="sg-core" containerID="cri-o://3511a8fadfd4d408af8dd7cfad50cde220b8abfee4439a6034133cc037a305e8" gracePeriod=30 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.859797 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-notification-agent" containerID="cri-o://3e3197854a26bf19017707f823b368d90d33292e1a5a4ca1ccb629c45420d998" gracePeriod=30 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.868337 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79cf79466c-6hfp8" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.868952 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.885804 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns" (OuterVolumeSpecName: "kube-api-access-whnns") pod "0487424d-5178-4843-ae8d-db6c015fe9d4" (UID: "0487424d-5178-4843-ae8d-db6c015fe9d4"). InnerVolumeSpecName "kube-api-access-whnns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.913008 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.936555 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-65sfh"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977275 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977324 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977385 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977414 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle\") pod \"2d92c8e6-2992-41be-950b-1eba6c84b636\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977509 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977556 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data\") pod \"2d92c8e6-2992-41be-950b-1eba6c84b636\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977598 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmkh7\" (UniqueName: \"kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: 
\"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977622 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977640 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977662 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977706 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977759 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977778 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977803 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8sgm\" (UniqueName: \"kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm\") pod \"2d92c8e6-2992-41be-950b-1eba6c84b636\" (UID: \"2d92c8e6-2992-41be-950b-1eba6c84b636\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977834 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default\") pod \"e3426b66-ea91-4d99-86c5-955d77073619\" (UID: \"e3426b66-ea91-4d99-86c5-955d77073619\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.977870 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x59xw\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw\") pod \"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\" (UID: 
\"4b18be8a-5ff5-4b20-b6d5-5ca167f33583\") " Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.978252 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whnns\" (UniqueName: \"kubernetes.io/projected/0487424d-5178-4843-ae8d-db6c015fe9d4-kube-api-access-whnns\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.979021 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.979247 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" containerName="kube-state-metrics" containerID="cri-o://7d8a87521315ef54b08c4e35b1d82277cba49344eb491d6cd687e4acfa7826e7" gracePeriod=30 Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.982730 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.983399 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.985456 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.995658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:31 crc kubenswrapper[4752]: I1124 11:28:31.997070 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.001425 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.016972 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.036610 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw" (OuterVolumeSpecName: "kube-api-access-x59xw") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "kube-api-access-x59xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.053510 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-57bcdc7dc8-2mzt6"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.080191 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082200 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x59xw\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-kube-api-access-x59xw\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082222 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3426b66-ea91-4d99-86c5-955d77073619-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082234 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082245 4752 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3426b66-ea91-4d99-86c5-955d77073619-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082254 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.082262 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.111588 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.112072 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm" (OuterVolumeSpecName: "kube-api-access-c8sgm") pod "2d92c8e6-2992-41be-950b-1eba6c84b636" (UID: "2d92c8e6-2992-41be-950b-1eba6c84b636"). 
InnerVolumeSpecName "kube-api-access-c8sgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.114974 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.124969 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7" (OuterVolumeSpecName: "kube-api-access-jmkh7") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "kube-api-access-jmkh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.144764 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.145039 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" containerName="memcached" containerID="cri-o://2163dde349442cb3ffdc514182a2490f24052d186366d6badae0a20befe860b1" gracePeriod=30 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.186364 4752 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.186394 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmkh7\" (UniqueName: \"kubernetes.io/projected/e3426b66-ea91-4d99-86c5-955d77073619-kube-api-access-jmkh7\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.186418 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.186428 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8sgm\" (UniqueName: \"kubernetes.io/projected/2d92c8e6-2992-41be-950b-1eba6c84b636-kube-api-access-c8sgm\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.197773 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cb9117e-3eeb-4c01-af6a-643eec81666c" (UID: "7cb9117e-3eeb-4c01-af6a-643eec81666c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.203868 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wf9mt"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.224035 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-v9hbc"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.247529 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wf9mt"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.271678 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-v9hbc"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.279682 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.279941 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7f4ddd687-vk74x" podUID="1cd9d1a6-a562-443e-b16d-76c159107794" containerName="keystone-api" containerID="cri-o://0a1dcccaeed6faa3a0411c4c11965da3a4a0622d15fdfb5a731e239a841bf153" gracePeriod=30 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.288936 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.292891 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.299483 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": dial tcp 10.217.0.197:3000: connect: connection refused" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.302852 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone7b4f-account-delete-jdrrz"] Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.303421 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-server" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.303521 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-server" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.303632 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0487424d-5178-4843-ae8d-db6c015fe9d4" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.303717 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0487424d-5178-4843-ae8d-db6c015fe9d4" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.303826 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="init" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.303913 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="init" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.303998 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="galera" Nov 24 11:28:32 
crc kubenswrapper[4752]: I1124 11:28:32.304082 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="galera" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304167 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.304248 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304330 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.304398 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304477 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerName="nova-cell0-conductor-conductor" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.304548 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerName="nova-cell0-conductor-conductor" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304627 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker-log" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.304712 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker-log" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304828 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener-log" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.304907 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener-log" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.304998 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="mysql-bootstrap" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.305072 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="mysql-bootstrap" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.305152 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="ovsdbserver-nb" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.305234 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="ovsdbserver-nb" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.305310 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-httpd" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.305381 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-httpd" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.305479 4752 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="dnsmasq-dns" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.306788 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="dnsmasq-dns" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.306898 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bd298d-28f4-4c50-80b3-f81ed96de794" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.306998 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bd298d-28f4-4c50-80b3-f81ed96de794" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.307098 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="ovsdbserver-sb" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.307198 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="ovsdbserver-sb" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.307281 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.307359 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.307452 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.307533 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.307957 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308097 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308199 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308296 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0487424d-5178-4843-ae8d-db6c015fe9d4" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308406 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" containerName="barbican-worker-log" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308526 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a1ee4e-2810-485d-b931-d3121fe7e264" containerName="ovsdbserver-nb" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308639 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-httpd" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.308763 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bd298d-28f4-4c50-80b3-f81ed96de794" 
containerName="openstack-network-exporter" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309015 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3426b66-ea91-4d99-86c5-955d77073619" containerName="galera" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309116 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener-log" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309223 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c699191a-ace5-4413-8607-e801f9646d0d" containerName="dnsmasq-dns" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309366 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" containerName="ovsdbserver-sb" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309472 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" containerName="proxy-server" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309576 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" containerName="nova-cell0-conductor-conductor" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.309673 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" containerName="barbican-keystone-listener" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.310545 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.312380 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone7b4f-account-delete-jdrrz"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.339128 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-j8hcn"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.388009 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-j8hcn"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.392579 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.392629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtdc\" (UniqueName: \"kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.402647 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: connect: connection refused" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.414508 4752 generic.go:334] "Generic (PLEG): container finished" podID="f3c01e5f-8338-497f-9e47-e2e3e857df63" 
containerID="9f8c293fe3f68cdc99eb5f8a593f9b4e799568c50b7fc38020752eb9bf6b2982" exitCode=0 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.415002 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerDied","Data":"9f8c293fe3f68cdc99eb5f8a593f9b4e799568c50b7fc38020752eb9bf6b2982"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.441140 4752 generic.go:334] "Generic (PLEG): container finished" podID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerID="11876dbf5b00d15eb3d002e8d11562f90c15b2bebf4c5f309e67dc8dc3e37bbb" exitCode=0 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.441428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerDied","Data":"11876dbf5b00d15eb3d002e8d11562f90c15b2bebf4c5f309e67dc8dc3e37bbb"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.457852 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell080e2-account-delete-8s7w5" event={"ID":"273fc6bc-214e-4fca-8939-003f983d6aa1","Type":"ContainerStarted","Data":"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.458481 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell080e2-account-delete-8s7w5" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.458831 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f4c5-account-create-6zcx6"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.465463 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0487424d-5178-4843-ae8d-db6c015fe9d4" (UID: "0487424d-5178-4843-ae8d-db6c015fe9d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.470707 4752 generic.go:334] "Generic (PLEG): container finished" podID="a6e4558a-350e-442a-ad61-70d9a9824219" containerID="aaadfc25131dfddd66fa261c60845cf30f3a819ab939eec3683de48fc967913c" exitCode=0 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.470799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerDied","Data":"aaadfc25131dfddd66fa261c60845cf30f3a819ab939eec3683de48fc967913c"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.471608 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.485262 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancef4c5-account-delete-kjkjp" event={"ID":"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0","Type":"ContainerStarted","Data":"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.485854 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/glancef4c5-account-delete-kjkjp" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.505728 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.505778 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtdc\" (UniqueName: \"kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.505904 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.506184 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.506229 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:33.00621476 +0000 UTC m=+1318.991035049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.536893 4752 projected.go:194] Error preparing data for projected volume kube-api-access-lgtdc for pod openstack/keystone7b4f-account-delete-jdrrz: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.536987 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:33.036964699 +0000 UTC m=+1319.021784988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lgtdc" (UniqueName: "kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.537343 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.537380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d92c8e6-2992-41be-950b-1eba6c84b636","Type":"ContainerDied","Data":"0a8d7b4a78af82bdec9ede8779357db9c95dd08572eb99903e7abda6ae72ef5e"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.537429 4752 scope.go:117] "RemoveContainer" containerID="f7b77ab3e2e24a2b9d77f401f9cb923e7c8b2ba3bb0f4001ef3a2f170f3bb279" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.544182 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f4c5-account-create-6zcx6"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.555088 4752 generic.go:334] "Generic (PLEG): container finished" podID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerID="b64a3d9cdc49b880ccf1a894ebc75a567c3fbafc89ff71380851b7711949247f" exitCode=0 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.555416 4752 generic.go:334] "Generic (PLEG): container finished" podID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerID="3511a8fadfd4d408af8dd7cfad50cde220b8abfee4439a6034133cc037a305e8" exitCode=2 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.555502 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4kfpv"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.555524 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerDied","Data":"b64a3d9cdc49b880ccf1a894ebc75a567c3fbafc89ff71380851b7711949247f"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.555550 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerDied","Data":"3511a8fadfd4d408af8dd7cfad50cde220b8abfee4439a6034133cc037a305e8"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.562934 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4kfpv"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.577020 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanf581-account-delete-qm6qk" event={"ID":"31369e0b-e7eb-44cc-9d7e-21d196d95ad3","Type":"ContainerStarted","Data":"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.577477 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbicanf581-account-delete-qm6qk" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.595375 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"] Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.611994 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.612049 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:33.112034979 +0000 UTC m=+1319.096855268 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.612234 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.612266 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:33.112249375 +0000 UTC m=+1319.097069664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.613048 4752 generic.go:334] "Generic (PLEG): container finished" podID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" containerID="7d8a87521315ef54b08c4e35b1d82277cba49344eb491d6cd687e4acfa7826e7" exitCode=2 Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.613108 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddb825f2-a05f-409d-b4cc-80408e2db5d7","Type":"ContainerDied","Data":"7d8a87521315ef54b08c4e35b1d82277cba49344eb491d6cd687e4acfa7826e7"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.623240 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7b4f-account-create-pq4n9"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.637303 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone7b4f-account-delete-jdrrz"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.644074 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.644778 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79cf79466c-6hfp8" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.644833 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79cf79466c-6hfp8" event={"ID":"4b18be8a-5ff5-4b20-b6d5-5ca167f33583","Type":"ContainerDied","Data":"32aecea165f1193639b453c2f9ab5c0e77b28185ea8e6c8fd5cb661270789aa4"} Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.644956 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.645377 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7ffb7c9ccc-wsc5d" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.661798 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-x9jpf"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.670146 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-x9jpf"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.672358 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:56484->10.217.0.202:8775: read: connection reset by peer" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.672399 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:56476->10.217.0.202:8775: read: connection reset by peer" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.678896 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.687683 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-80e2-account-create-xlp8w"] Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.721282 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: E1124 11:28:32.721347 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:33.221332048 +0000 UTC m=+1319.206152337 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.723326 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-80e2-account-create-xlp8w"] Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.753321 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68f8b8b648-65q5g" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:44440->10.217.0.157:9311: read: connection reset by peer" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.753691 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68f8b8b648-65q5g" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:44452->10.217.0.157:9311: read: connection reset by peer" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.784186 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bd298d-28f4-4c50-80b3-f81ed96de794" path="/var/lib/kubelet/pods/01bd298d-28f4-4c50-80b3-f81ed96de794/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.788327 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e756e30-c210-4624-a396-e9e05691f1ed" path="/var/lib/kubelet/pods/0e756e30-c210-4624-a396-e9e05691f1ed/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.791569 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a954e1c-c17d-4296-b165-22b1653af22f" path="/var/lib/kubelet/pods/2a954e1c-c17d-4296-b165-22b1653af22f/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.792413 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c789332-baf6-4602-b509-7421b2ff22fe" path="/var/lib/kubelet/pods/3c789332-baf6-4602-b509-7421b2ff22fe/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.793303 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93537e9d-a8b8-4fb4-b982-2971ccf60f9d" path="/var/lib/kubelet/pods/93537e9d-a8b8-4fb4-b982-2971ccf60f9d/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.794690 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e6380d-e699-4757-ac88-6d245859dadd" path="/var/lib/kubelet/pods/b9e6380d-e699-4757-ac88-6d245859dadd/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.795418 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c699191a-ace5-4413-8607-e801f9646d0d" path="/var/lib/kubelet/pods/c699191a-ace5-4413-8607-e801f9646d0d/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.796457 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd899b7f-383d-45fd-9ec8-48caa4f954f2" path="/var/lib/kubelet/pods/dd899b7f-383d-45fd-9ec8-48caa4f954f2/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.798345 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0419545-a8e5-484a-8b23-8e57bd2a6d7b" path="/var/lib/kubelet/pods/e0419545-a8e5-484a-8b23-8e57bd2a6d7b/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 
11:28:32.799170 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.799438 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e82288-6d0b-4bc0-b934-5c0c869a8198" path="/var/lib/kubelet/pods/e4e82288-6d0b-4bc0-b934-5c0c869a8198/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.800788 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe564c7e-869d-40d3-9472-839c4fbb51e6" path="/var/lib/kubelet/pods/fe564c7e-869d-40d3-9472-839c4fbb51e6/volumes" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.850114 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "0487424d-5178-4843-ae8d-db6c015fe9d4" (UID: "0487424d-5178-4843-ae8d-db6c015fe9d4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.857171 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell080e2-account-delete-8s7w5" podStartSLOduration=5.857153214 podStartE2EDuration="5.857153214s" podCreationTimestamp="2025-11-24 11:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:28:32.477335816 +0000 UTC m=+1318.462156095" watchObservedRunningTime="2025-11-24 11:28:32.857153214 +0000 UTC m=+1318.841973493" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.863132 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.863161 4752 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.869422 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glancef4c5-account-delete-kjkjp" podStartSLOduration=5.869398858 podStartE2EDuration="5.869398858s" podCreationTimestamp="2025-11-24 11:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:28:32.533497999 +0000 UTC m=+1318.518318288" watchObservedRunningTime="2025-11-24 11:28:32.869398858 +0000 UTC m=+1318.854219157" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.872727 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicanf581-account-delete-qm6qk" podStartSLOduration=6.872710974 podStartE2EDuration="6.872710974s" podCreationTimestamp="2025-11-24 11:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:28:32.598398985 +0000 UTC m=+1318.583219274" watchObservedRunningTime="2025-11-24 11:28:32.872710974 +0000 UTC m=+1318.857531253" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.910334 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data" (OuterVolumeSpecName: "config-data") pod "7cb9117e-3eeb-4c01-af6a-643eec81666c" (UID: "7cb9117e-3eeb-4c01-af6a-643eec81666c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.927899 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d92c8e6-2992-41be-950b-1eba6c84b636" (UID: "2d92c8e6-2992-41be-950b-1eba6c84b636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.971893 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:32 crc kubenswrapper[4752]: I1124 11:28:32.972222 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb9117e-3eeb-4c01-af6a-643eec81666c-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.013630 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data" (OuterVolumeSpecName: "config-data") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.073848 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data" (OuterVolumeSpecName: "config-data") pod "0487424d-5178-4843-ae8d-db6c015fe9d4" (UID: "0487424d-5178-4843-ae8d-db6c015fe9d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.075643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.075674 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtdc\" (UniqueName: \"kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.075915 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.075970 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.075951948 +0000 UTC m=+1320.060772247 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.076478 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.076495 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.084898 4752 projected.go:194] Error preparing data for projected volume kube-api-access-lgtdc for pod openstack/keystone7b4f-account-delete-jdrrz: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.084964 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.084944278 +0000 UTC m=+1320.069764577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lgtdc" (UniqueName: "kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.131210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.138549 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "0487424d-5178-4843-ae8d-db6c015fe9d4" (UID: "0487424d-5178-4843-ae8d-db6c015fe9d4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.141618 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data" (OuterVolumeSpecName: "config-data") pod "2d92c8e6-2992-41be-950b-1eba6c84b636" (UID: "2d92c8e6-2992-41be-950b-1eba6c84b636"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.171495 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.177864 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.177959 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.177937526 +0000 UTC m=+1320.162757865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.179001 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.179037 4752 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0487424d-5178-4843-ae8d-db6c015fe9d4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.179050 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d92c8e6-2992-41be-950b-1eba6c84b636-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.179062 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.179130 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.179187 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.179168501 +0000 UTC m=+1320.163988850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.191949 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.212972 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e3426b66-ea91-4d99-86c5-955d77073619" (UID: "e3426b66-ea91-4d99-86c5-955d77073619"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.222648 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="galera" containerID="cri-o://2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72" gracePeriod=30 Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.254594 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b18be8a-5ff5-4b20-b6d5-5ca167f33583" (UID: "4b18be8a-5ff5-4b20-b6d5-5ca167f33583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.282893 4752 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3426b66-ea91-4d99-86c5-955d77073619-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.284608 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.284625 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b18be8a-5ff5-4b20-b6d5-5ca167f33583-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.282914 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.284701 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.284681131 +0000 UTC m=+1320.269501420 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.606118 4752 kubelet_pods.go:2476] "Failed to reduce cpu time for pod pending volume cleanup" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" err="openat2 /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb9117e_3eeb_4c01_af6a_643eec81666c.slice/cgroup.controllers: no such file or directory" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607444 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hlrvc"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607473 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hlrvc"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607488 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b683-account-create-zdwjb"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607499 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b683-account-create-zdwjb"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607510 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607534 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6rbdm"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607544 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6rbdm"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607555 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-574b-account-create-qkvsq"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607564 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.607575 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-574b-account-create-qkvsq"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.668151 4752 scope.go:117] "RemoveContainer" containerID="3971caac8147987cbdf6ff339a4a02e13902ca6259e0896f8df16f72509dc49d" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.669285 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.674491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57b7bbd86d-9drzs" event={"ID":"a6e4558a-350e-442a-ad61-70d9a9824219","Type":"ContainerDied","Data":"327462de80c54143cc9b42a44939bcc5f3f982657f91658cc2099742d508a41d"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.674561 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="327462de80c54143cc9b42a44939bcc5f3f982657f91658cc2099742d508a41d"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.677976 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.690810 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.105:11211: connect: connection refused"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.693637 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.693884 4752 generic.go:334] "Generic (PLEG): container finished" podID="6c592991-2ffb-417a-aa80-49d0111618bb" containerID="58ade1c43e6907e3577011461f577c10b33e8682d39ece87e8fbf57d78ed060c" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.694006 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerDied","Data":"58ade1c43e6907e3577011461f577c10b33e8682d39ece87e8fbf57d78ed060c"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.698100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz5xc\" (UniqueName: \"kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc\") pod \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") "
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.698170 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle\") pod \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") "
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.698233 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs\") pod \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") "
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.698276 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config\") pod \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\" (UID: \"ddb825f2-a05f-409d-b4cc-80408e2db5d7\") "
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.703356 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.711794 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.723392 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc" (OuterVolumeSpecName: "kube-api-access-zz5xc") pod "ddb825f2-a05f-409d-b4cc-80408e2db5d7" (UID: "ddb825f2-a05f-409d-b4cc-80408e2db5d7"). InnerVolumeSpecName "kube-api-access-zz5xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.723548 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement83dd-account-delete-87gjb" event={"ID":"7cb3afd6-0557-4a85-9763-1572d92e6aa3","Type":"ContainerDied","Data":"d1ebfc6f10dd4eb52b4acd0b3bb270a0fd32ec1642ea5d5b5155483350b10e02"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.723610 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ebfc6f10dd4eb52b4acd0b3bb270a0fd32ec1642ea5d5b5155483350b10e02"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.725875 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.732895 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.734265 4752 generic.go:334] "Generic (PLEG): container finished" podID="1020727e-3725-4fed-b276-043ae4d30c4c" containerID="2163dde349442cb3ffdc514182a2490f24052d186366d6badae0a20befe860b1" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.734323 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1020727e-3725-4fed-b276-043ae4d30c4c","Type":"ContainerDied","Data":"2163dde349442cb3ffdc514182a2490f24052d186366d6badae0a20befe860b1"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.738854 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.739905 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3c01e5f-8338-497f-9e47-e2e3e857df63","Type":"ContainerDied","Data":"0aa09047cdbffb520d71d2f8b1165c33309be8bed98715681874484ee3fb7809"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.740018 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa09047cdbffb520d71d2f8b1165c33309be8bed98715681874484ee3fb7809"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.749993 4752 generic.go:334] "Generic (PLEG): container finished" podID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerID="3e3197854a26bf19017707f823b368d90d33292e1a5a4ca1ccb629c45420d998" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.750216 4752 generic.go:334] "Generic (PLEG): container finished" podID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerID="42af4a9e7e4f79d0e530981eec488f3193a8e0ab8487ba8583a6beacc33cd3f9" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.750009 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" probeResult="failure" output="command timed out"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.749830 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7ffb7c9ccc-wsc5d"]
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.750767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerDied","Data":"3e3197854a26bf19017707f823b368d90d33292e1a5a4ca1ccb629c45420d998"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.750847 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerDied","Data":"42af4a9e7e4f79d0e530981eec488f3193a8e0ab8487ba8583a6beacc33cd3f9"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.756944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ddb825f2-a05f-409d-b4cc-80408e2db5d7" (UID: "ddb825f2-a05f-409d-b4cc-80408e2db5d7"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.757026 4752 generic.go:334] "Generic (PLEG): container finished" podID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerID="820c72aca67b79ea64a516208b0eaa7594ab5684ba110221a1b876bae259b74a" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.757110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerDied","Data":"820c72aca67b79ea64a516208b0eaa7594ab5684ba110221a1b876bae259b74a"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.760111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ddb825f2-a05f-409d-b4cc-80408e2db5d7","Type":"ContainerDied","Data":"37d0f849cb5dc5f2ad9919e4c2d9c244d2e10f4fb87a539f8bdafee6df4bde47"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.760224 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.770669 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerDied","Data":"944ddefc80e86ba009045369f39188fcfd4ed96c67435d31b540b52189cc2961"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.770635 4752 generic.go:334] "Generic (PLEG): container finished" podID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerID="944ddefc80e86ba009045369f39188fcfd4ed96c67435d31b540b52189cc2961" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.773687 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3b7b-account-delete-rj27n" event={"ID":"6b9d21e3-863e-4129-a794-41c3bb8899df","Type":"ContainerDied","Data":"10c4b439ce7ad25316b6f53a5450e988eb5c29a9c5fefcd5bcf1ef2742400177"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.773713 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10c4b439ce7ad25316b6f53a5450e988eb5c29a9c5fefcd5bcf1ef2742400177"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.775972 4752 generic.go:334] "Generic (PLEG): container finished" podID="9d0084f6-064e-4089-87fe-4ead63923b56" containerID="9e7930d8efc895b38537d979ec9125f2a956817fea154183ec332abf612d2f8c" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.776025 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerDied","Data":"9e7930d8efc895b38537d979ec9125f2a956817fea154183ec332abf612d2f8c"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.777348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib683-account-delete-k7898" event={"ID":"11bd1a8d-cbf9-47c6-a116-538f60634a33","Type":"ContainerStarted","Data":"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.778910 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapib683-account-delete-k7898" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.779151 4752 generic.go:334] "Generic (PLEG): container finished" podID="51186e60-74ec-447f-afd0-14056f01d5a3" containerID="9139c1b70a2b425efb74d8d55e0b897e54430b4e552e063513eb97a27c8f6d2b" exitCode=1
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.779191 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell119d4-account-delete-vksfq" event={"ID":"51186e60-74ec-447f-afd0-14056f01d5a3","Type":"ContainerDied","Data":"9139c1b70a2b425efb74d8d55e0b897e54430b4e552e063513eb97a27c8f6d2b"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.783180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron574b-account-delete-nw8qc" event={"ID":"a8b9f59e-8570-4b2a-9c7b-14be641f74fe","Type":"ContainerStarted","Data":"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.783658 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron574b-account-delete-nw8qc" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.787815 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerID="858e76b45449fa0fcdbccdf4d4f6325a65859816c0ee8d347ba69396fbd78a83" exitCode=0
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.787896 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerDied","Data":"858e76b45449fa0fcdbccdf4d4f6325a65859816c0ee8d347ba69396fbd78a83"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.787922 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b3b08ab3-f6c3-417e-abce-e03994e7553d","Type":"ContainerDied","Data":"710ca0b3182fe50b2b01b8c727287d36e5ef90de149923679f0ad39a88ac05a6"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.787936 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710ca0b3182fe50b2b01b8c727287d36e5ef90de149923679f0ad39a88ac05a6"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.794140 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glancef4c5-account-delete-kjkjp" podUID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" containerName="mariadb-account-delete" containerID="cri-o://0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515" gracePeriod=30
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.794227 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ffd3886c-7122-4bd5-97d2-5c448c31e941","Type":"ContainerDied","Data":"e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3"}
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.794249 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0949c324ffe3ff86542cb48faffa1696d02a6667acf5cdb256904e4f58d19f3"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.795237 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbicanf581-account-delete-qm6qk" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found"
Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.796595 4752 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell080e2-account-delete-8s7w5" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found"
pod="openstack/novacell080e2-account-delete-8s7w5" secret="" err="secret \"galera-openstack-dockercfg-zg767\" not found" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.797365 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapib683-account-delete-k7898" podStartSLOduration=6.79735432 podStartE2EDuration="6.79735432s" podCreationTimestamp="2025-11-24 11:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:28:33.794088615 +0000 UTC m=+1319.778908904" watchObservedRunningTime="2025-11-24 11:28:33.79735432 +0000 UTC m=+1319.782174609" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.802370 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz5xc\" (UniqueName: \"kubernetes.io/projected/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-api-access-zz5xc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.802399 4752 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.802680 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.802735 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.302720105 +0000 UTC m=+1320.287540394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.803213 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.803390 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:34.303370364 +0000 UTC m=+1320.288190653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.848050 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.852941 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-lgtdc operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone7b4f-account-delete-jdrrz" podUID="29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.865373 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron574b-account-delete-nw8qc" podStartSLOduration=6.865337285 podStartE2EDuration="6.865337285s" podCreationTimestamp="2025-11-24 11:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:28:33.832141565 +0000 UTC m=+1319.816961854" watchObservedRunningTime="2025-11-24 11:28:33.865337285 +0000 UTC m=+1319.850157594" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.880463 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddb825f2-a05f-409d-b4cc-80408e2db5d7" (UID: "ddb825f2-a05f-409d-b4cc-80408e2db5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.891303 4752 scope.go:117] "RemoveContainer" containerID="38c41eef16564abd80395499860a5b1c7de6d60c007496e866d4d37f4047cebc" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.901133 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.904283 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.905878 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906009 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906583 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom\") pod 
\"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906668 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906686 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhfj\" (UniqueName: \"kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906727 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906775 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906809 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.906832 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs\") pod \"ffd3886c-7122-4bd5-97d2-5c448c31e941\" (UID: \"ffd3886c-7122-4bd5-97d2-5c448c31e941\") " Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.908246 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.912735 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.912803 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.912932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.913657 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs" (OuterVolumeSpecName: "logs") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.914109 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57b7bbd86d-9drzs" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.919089 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.920117 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.930367 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj" (OuterVolumeSpecName: "kube-api-access-mmhfj") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "kube-api-access-mmhfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.931245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ddb825f2-a05f-409d-b4cc-80408e2db5d7" (UID: "ddb825f2-a05f-409d-b4cc-80408e2db5d7"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.934996 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.936412 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.936709 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts" (OuterVolumeSpecName: "scripts") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.937251 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.949946 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.954576 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.967237 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-79cf79466c-6hfp8"] Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.969332 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.969664 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 24 11:28:33 crc kubenswrapper[4752]: E1124 11:28:33.969802 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.974100 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.983107 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" probeResult="failure" output=< Nov 24 11:28:33 crc kubenswrapper[4752]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Nov 24 11:28:33 crc kubenswrapper[4752]: > Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.994386 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:28:33 crc kubenswrapper[4752]: I1124 11:28:33.995688 4752 scope.go:117] "RemoveContainer" containerID="7d8a87521315ef54b08c4e35b1d82277cba49344eb491d6cd687e4acfa7826e7" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.005050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data" (OuterVolumeSpecName: "config-data") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010083 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010570 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010634 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010659 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010834 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010868 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010902 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010943 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-combined-ca-bundle\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.010975 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhm5p\" (UniqueName: \"kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011004 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: 
\"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011027 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011052 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm5rw\" (UniqueName: \"kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011082 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011104 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svb6w\" (UniqueName: \"kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w\") pod \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxw5j\" (UniqueName: \"kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011170 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts\") pod \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\" (UID: \"7cb3afd6-0557-4a85-9763-1572d92e6aa3\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011193 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts\") pod \"6b9d21e3-863e-4129-a794-41c3bb8899df\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011277 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011301 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: 
\"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011328 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011369 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hrns\" (UniqueName: \"kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns\") pod \"6b9d21e3-863e-4129-a794-41c3bb8899df\" (UID: \"6b9d21e3-863e-4129-a794-41c3bb8899df\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011408 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011434 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011473 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011513 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxmfz\" (UniqueName: \"kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz\") pod \"a6e4558a-350e-442a-ad61-70d9a9824219\" (UID: \"a6e4558a-350e-442a-ad61-70d9a9824219\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011551 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.011581 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs\") pod \"b3b08ab3-f6c3-417e-abce-e03994e7553d\" (UID: \"b3b08ab3-f6c3-417e-abce-e03994e7553d\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.012574 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.012610 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc 
kubenswrapper[4752]: I1124 11:28:34.012636 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.012656 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\" (UID: \"16fc61a1-6794-4ecf-a3dd-d79a88eda486\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.012688 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run\") pod \"f3c01e5f-8338-497f-9e47-e2e3e857df63\" (UID: \"f3c01e5f-8338-497f-9e47-e2e3e857df63\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013410 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013431 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmhfj\" (UniqueName: \"kubernetes.io/projected/ffd3886c-7122-4bd5-97d2-5c448c31e941-kube-api-access-mmhfj\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013447 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013460 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013473 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd3886c-7122-4bd5-97d2-5c448c31e941-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013485 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ffd3886c-7122-4bd5-97d2-5c448c31e941-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013497 4752 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb825f2-a05f-409d-b4cc-80408e2db5d7-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013511 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.013929 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.015507 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs" (OuterVolumeSpecName: "logs") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.016616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs" (OuterVolumeSpecName: "logs") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.018941 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9d21e3-863e-4129-a794-41c3bb8899df" (UID: "6b9d21e3-863e-4129-a794-41c3bb8899df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.019261 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs" (OuterVolumeSpecName: "logs") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.021217 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs" (OuterVolumeSpecName: "logs") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.022478 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cb3afd6-0557-4a85-9763-1572d92e6aa3" (UID: "7cb3afd6-0557-4a85-9763-1572d92e6aa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.022565 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.022814 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts" (OuterVolumeSpecName: "scripts") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.022869 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.029076 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.030485 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j" (OuterVolumeSpecName: "kube-api-access-rxw5j") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "kube-api-access-rxw5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.051010 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts" (OuterVolumeSpecName: "scripts") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.051014 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts" (OuterVolumeSpecName: "scripts") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.051051 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.051064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p" (OuterVolumeSpecName: "kube-api-access-xhm5p") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "kube-api-access-xhm5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.051120 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ffd3886c-7122-4bd5-97d2-5c448c31e941" (UID: "ffd3886c-7122-4bd5-97d2-5c448c31e941"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.056976 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw" (OuterVolumeSpecName: "kube-api-access-wm5rw") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "kube-api-access-wm5rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.057001 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.060198 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz" (OuterVolumeSpecName: "kube-api-access-qxmfz") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "kube-api-access-qxmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.063840 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.063947 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w" (OuterVolumeSpecName: "kube-api-access-svb6w") pod "7cb3afd6-0557-4a85-9763-1572d92e6aa3" (UID: "7cb3afd6-0557-4a85-9763-1572d92e6aa3"). InnerVolumeSpecName "kube-api-access-svb6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.064975 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns" (OuterVolumeSpecName: "kube-api-access-9hrns") pod "6b9d21e3-863e-4129-a794-41c3bb8899df" (UID: "6b9d21e3-863e-4129-a794-41c3bb8899df"). InnerVolumeSpecName "kube-api-access-9hrns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133510 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133552 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133641 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs\") pod \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133683 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle\") pod \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133834 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133876 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data\") pod \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133904 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133955 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dklh\" (UniqueName: \"kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh\") pod \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\" (UID: \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.133996 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs\") pod \"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\" (UID: 
\"4bdb834d-3fdb-4bed-b349-e41c60ec4d64\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.134041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.134122 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.134158 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw4wc\" (UniqueName: \"kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc\") pod \"2e8ac218-b256-4d82-afb6-34448254aa9f\" (UID: \"2e8ac218-b256-4d82-afb6-34448254aa9f\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.135685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136032 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136346 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136387 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtdc\" (UniqueName: \"kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc\") pod \"keystone7b4f-account-delete-jdrrz\" (UID: \"29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b\") " pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136703 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136731 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136756 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136765 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136774 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136793 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136804 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhm5p\" (UniqueName: \"kubernetes.io/projected/f3c01e5f-8338-497f-9e47-e2e3e857df63-kube-api-access-xhm5p\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136818 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm5rw\" (UniqueName: \"kubernetes.io/projected/b3b08ab3-f6c3-417e-abce-e03994e7553d-kube-api-access-wm5rw\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136828 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svb6w\" (UniqueName: \"kubernetes.io/projected/7cb3afd6-0557-4a85-9763-1572d92e6aa3-kube-api-access-svb6w\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136837 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxw5j\" (UniqueName: \"kubernetes.io/projected/16fc61a1-6794-4ecf-a3dd-d79a88eda486-kube-api-access-rxw5j\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc 
kubenswrapper[4752]: I1124 11:28:34.136847 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136860 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cb3afd6-0557-4a85-9763-1572d92e6aa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136874 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9d21e3-863e-4129-a794-41c3bb8899df-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136883 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136893 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c01e5f-8338-497f-9e47-e2e3e857df63-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136905 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hrns\" (UniqueName: \"kubernetes.io/projected/6b9d21e3-863e-4129-a794-41c3bb8899df-kube-api-access-9hrns\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136914 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6e4558a-350e-442a-ad61-70d9a9824219-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136923 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxmfz\" (UniqueName: \"kubernetes.io/projected/a6e4558a-350e-442a-ad61-70d9a9824219-kube-api-access-qxmfz\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136932 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffd3886c-7122-4bd5-97d2-5c448c31e941-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136944 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fc61a1-6794-4ecf-a3dd-d79a88eda486-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.136952 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b08ab3-f6c3-417e-abce-e03994e7553d-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.147882 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs" (OuterVolumeSpecName: "logs") pod "4bdb834d-3fdb-4bed-b349-e41c60ec4d64" (UID: "4bdb834d-3fdb-4bed-b349-e41c60ec4d64"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.150411 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.150468 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:36.150444796 +0000 UTC m=+1322.135265085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.155873 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68f8b8b648-65q5g" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.162094 4752 projected.go:194] Error preparing data for projected volume kube-api-access-lgtdc for pod openstack/keystone7b4f-account-delete-jdrrz: failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.163586 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc podName:29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:36.163219605 +0000 UTC m=+1322.148039894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lgtdc" (UniqueName: "kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc") pod "keystone7b4f-account-delete-jdrrz" (UID: "29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b") : failed to fetch token: serviceaccounts "galera-openstack" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.167441 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts" (OuterVolumeSpecName: "scripts") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.167581 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.184782 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc" (OuterVolumeSpecName: "kube-api-access-nw4wc") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "kube-api-access-nw4wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.189429 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.192447 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.204274 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh" (OuterVolumeSpecName: "kube-api-access-8dklh") pod "4bdb834d-3fdb-4bed-b349-e41c60ec4d64" (UID: "4bdb834d-3fdb-4bed-b349-e41c60ec4d64"). InnerVolumeSpecName "kube-api-access-8dklh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.206120 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.207484 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.220113 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238009 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data\") pod \"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle\") pod \"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238125 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle\") pod \"1020727e-3725-4fed-b276-043ae4d30c4c\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238147 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238184 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhd9l\" (UniqueName: \"kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l\") pod \"1020727e-3725-4fed-b276-043ae4d30c4c\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238309 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6wb7\" (UniqueName: \"kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7\") pod 
\"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238336 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id\") pod \"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238374 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs\") pod \"1020727e-3725-4fed-b276-043ae4d30c4c\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238405 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238612 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data\") pod \"1020727e-3725-4fed-b276-043ae4d30c4c\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238640 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238693 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config\") pod \"1020727e-3725-4fed-b276-043ae4d30c4c\" (UID: \"1020727e-3725-4fed-b276-043ae4d30c4c\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238730 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts\") pod \"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238759 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klw4x\" (UniqueName: \"kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x\") pod 
\"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238779 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom\") pod \"9d0084f6-064e-4089-87fe-4ead63923b56\" (UID: \"9d0084f6-064e-4089-87fe-4ead63923b56\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.238824 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs\") pod \"6c592991-2ffb-417a-aa80-49d0111618bb\" (UID: \"6c592991-2ffb-417a-aa80-49d0111618bb\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240719 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240756 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dklh\" (UniqueName: \"kubernetes.io/projected/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-kube-api-access-8dklh\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240767 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240779 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240789 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw4wc\" (UniqueName: \"kubernetes.io/projected/2e8ac218-b256-4d82-afb6-34448254aa9f-kube-api-access-nw4wc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240798 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240807 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240817 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e8ac218-b256-4d82-afb6-34448254aa9f-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.240870 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.240892 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1020727e-3725-4fed-b276-043ae4d30c4c" (UID: "1020727e-3725-4fed-b276-043ae4d30c4c"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.240920 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:36.24090247 +0000 UTC m=+1322.225722759 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.241942 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs" (OuterVolumeSpecName: "logs") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.242788 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data" (OuterVolumeSpecName: "config-data") pod "1020727e-3725-4fed-b276-043ae4d30c4c" (UID: "1020727e-3725-4fed-b276-043ae4d30c4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.243331 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.243410 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:36.243387362 +0000 UTC m=+1322.228207711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.246719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.252660 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l" (OuterVolumeSpecName: "kube-api-access-vhd9l") pod "1020727e-3725-4fed-b276-043ae4d30c4c" (UID: "1020727e-3725-4fed-b276-043ae4d30c4c"). InnerVolumeSpecName "kube-api-access-vhd9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.273204 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.299548 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.299941 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.300455 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x" (OuterVolumeSpecName: "kube-api-access-klw4x") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "kube-api-access-klw4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.301789 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7" (OuterVolumeSpecName: "kube-api-access-n6wb7") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "kube-api-access-n6wb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.310939 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts" (OuterVolumeSpecName: "scripts") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.343376 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data" (OuterVolumeSpecName: "config-data") pod "4bdb834d-3fdb-4bed-b349-e41c60ec4d64" (UID: "4bdb834d-3fdb-4bed-b349-e41c60ec4d64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.343475 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.343680 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:35.343666541 +0000 UTC m=+1321.328486830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.343491 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.344060 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:28:35.344047212 +0000 UTC m=+1321.328867511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.343427 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344086 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344128 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344138 4752 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1020727e-3725-4fed-b276-043ae4d30c4c-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344147 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344156 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344164 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klw4x\" (UniqueName: 
\"kubernetes.io/projected/9d0084f6-064e-4089-87fe-4ead63923b56-kube-api-access-klw4x\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.344163 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.344231 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:36.344210956 +0000 UTC m=+1322.329031245 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344172 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c592991-2ffb-417a-aa80-49d0111618bb-logs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344292 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhd9l\" (UniqueName: \"kubernetes.io/projected/1020727e-3725-4fed-b276-043ae4d30c4c-kube-api-access-vhd9l\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344307 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6wb7\" (UniqueName: \"kubernetes.io/projected/6c592991-2ffb-417a-aa80-49d0111618bb-kube-api-access-n6wb7\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.344319 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d0084f6-064e-4089-87fe-4ead63923b56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.364165 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data" (OuterVolumeSpecName: "config-data") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.375719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.393507 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.409595 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.442716 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data" (OuterVolumeSpecName: "config-data") pod "16fc61a1-6794-4ecf-a3dd-d79a88eda486" (UID: "16fc61a1-6794-4ecf-a3dd-d79a88eda486"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447362 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447393 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447408 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447420 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fc61a1-6794-4ecf-a3dd-d79a88eda486-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447431 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.447441 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.447502 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.447600 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data podName:a1f1e943-afb7-40c3-9ff2-56791f4e0ad5 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:42.447575994 +0000 UTC m=+1328.432396373 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data") pod "rabbitmq-server-0" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5") : configmap "rabbitmq-config-data" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.464689 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4bdb834d-3fdb-4bed-b349-e41c60ec4d64" (UID: "4bdb834d-3fdb-4bed-b349-e41c60ec4d64"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.498028 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.519271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.552797 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.553175 4752 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.553258 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.566847 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.571971 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3b08ab3-f6c3-417e-abce-e03994e7553d" (UID: "b3b08ab3-f6c3-417e-abce-e03994e7553d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.574865 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.579856 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.583662 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.585111 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bdb834d-3fdb-4bed-b349-e41c60ec4d64" (UID: "4bdb834d-3fdb-4bed-b349-e41c60ec4d64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.597916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data" (OuterVolumeSpecName: "config-data") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.630992 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.642988 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.643015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1020727e-3725-4fed-b276-043ae4d30c4c" (UID: "1020727e-3725-4fed-b276-043ae4d30c4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.655960 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.655994 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656004 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656013 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656021 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656031 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656039 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656048 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656056 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b08ab3-f6c3-417e-abce-e03994e7553d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.656064 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdb834d-3fdb-4bed-b349-e41c60ec4d64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.656123 4752 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.656171 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data podName:3d820ba8-03d6-48f4-9423-bbc1ed64a36b nodeName:}" failed. No retries permitted until 2025-11-24 11:28:42.656154053 +0000 UTC m=+1328.640974342 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data") pod "rabbitmq-cell1-server-0" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b") : configmap "rabbitmq-cell1-config-data" not found Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.667932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data" (OuterVolumeSpecName: "config-data") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.672398 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.678123 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.688413 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "1020727e-3725-4fed-b276-043ae4d30c4c" (UID: "1020727e-3725-4fed-b276-043ae4d30c4c"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.689720 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data" (OuterVolumeSpecName: "config-data") pod "f3c01e5f-8338-497f-9e47-e2e3e857df63" (UID: "f3c01e5f-8338-497f-9e47-e2e3e857df63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.692177 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data" (OuterVolumeSpecName: "config-data") pod "9d0084f6-064e-4089-87fe-4ead63923b56" (UID: "9d0084f6-064e-4089-87fe-4ead63923b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.717088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c592991-2ffb-417a-aa80-49d0111618bb" (UID: "6c592991-2ffb-417a-aa80-49d0111618bb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.724472 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a6e4558a-350e-442a-ad61-70d9a9824219" (UID: "a6e4558a-350e-442a-ad61-70d9a9824219"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.738811 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.749638 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0487424d-5178-4843-ae8d-db6c015fe9d4" path="/var/lib/kubelet/pods/0487424d-5178-4843-ae8d-db6c015fe9d4/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.756035 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1cc988-7c84-4647-98d5-6831aaff2915" path="/var/lib/kubelet/pods/0c1cc988-7c84-4647-98d5-6831aaff2915/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.756281 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.757068 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts\") pod \"51186e60-74ec-447f-afd0-14056f01d5a3\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.757127 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr8g\" (UniqueName: \"kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g\") pod \"51186e60-74ec-447f-afd0-14056f01d5a3\" (UID: \"51186e60-74ec-447f-afd0-14056f01d5a3\") " Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.758925 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51186e60-74ec-447f-afd0-14056f01d5a3" (UID: "51186e60-74ec-447f-afd0-14056f01d5a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.759901 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data" (OuterVolumeSpecName: "config-data") pod "2e8ac218-b256-4d82-afb6-34448254aa9f" (UID: "2e8ac218-b256-4d82-afb6-34448254aa9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760160 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760178 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8ac218-b256-4d82-afb6-34448254aa9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760191 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c01e5f-8338-497f-9e47-e2e3e857df63-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760204 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0084f6-064e-4089-87fe-4ead63923b56-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760224 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760236 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760251 4752 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1020727e-3725-4fed-b276-043ae4d30c4c-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760262 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c592991-2ffb-417a-aa80-49d0111618bb-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760272 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51186e60-74ec-447f-afd0-14056f01d5a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.760282 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6e4558a-350e-442a-ad61-70d9a9824219-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.761702 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.762935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g" (OuterVolumeSpecName: "kube-api-access-pjr8g") pod "51186e60-74ec-447f-afd0-14056f01d5a3" (UID: "51186e60-74ec-447f-afd0-14056f01d5a3"). InnerVolumeSpecName "kube-api-access-pjr8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.764895 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.764957 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerName="nova-cell1-conductor-conductor" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.765427 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d92c8e6-2992-41be-950b-1eba6c84b636" path="/var/lib/kubelet/pods/2d92c8e6-2992-41be-950b-1eba6c84b636/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.766886 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acd95db-8b48-48c6-9153-d4f2dd07745a" path="/var/lib/kubelet/pods/3acd95db-8b48-48c6-9153-d4f2dd07745a/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.767688 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b18be8a-5ff5-4b20-b6d5-5ca167f33583" path="/var/lib/kubelet/pods/4b18be8a-5ff5-4b20-b6d5-5ca167f33583/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.770556 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb9117e-3eeb-4c01-af6a-643eec81666c" path="/var/lib/kubelet/pods/7cb9117e-3eeb-4c01-af6a-643eec81666c/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.771138 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e150ca9-5a87-40ba-bbad-bfae139179ae" path="/var/lib/kubelet/pods/7e150ca9-5a87-40ba-bbad-bfae139179ae/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.771605 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" path="/var/lib/kubelet/pods/ddb825f2-a05f-409d-b4cc-80408e2db5d7/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.772511 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1469fa1-76e8-4cb0-a81f-2ec056462912" path="/var/lib/kubelet/pods/e1469fa1-76e8-4cb0-a81f-2ec056462912/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.773130 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3426b66-ea91-4d99-86c5-955d77073619" path="/var/lib/kubelet/pods/e3426b66-ea91-4d99-86c5-955d77073619/volumes" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.807109 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9d0084f6-064e-4089-87fe-4ead63923b56","Type":"ContainerDied","Data":"cf1d42fd20548e21c71af29102a5bd9e07f9567f10fb372db9995b22fff3e318"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.807158 4752 scope.go:117] "RemoveContainer" containerID="be46845395e6568e301fffb75b2b8cbf950babfb7216ca6d7fcccb52601c5794" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.807272 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.812243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1020727e-3725-4fed-b276-043ae4d30c4c","Type":"ContainerDied","Data":"dcba6482a3ae81032c0db9645c5e86cb29e0abe3e9376cbfd1c49910aa608d17"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.812343 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.814469 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell119d4-account-delete-vksfq" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.814469 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell119d4-account-delete-vksfq" event={"ID":"51186e60-74ec-447f-afd0-14056f01d5a3","Type":"ContainerDied","Data":"cd4de94aac35e65688ccf20767afcfad345409b7ffd4e0069b54b5fcab53a480"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.817284 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e8ac218-b256-4d82-afb6-34448254aa9f","Type":"ContainerDied","Data":"13c31024e25cdc8a474efaac8c249f5b73c4d7487d6ccdc27e5f7d73d85d329a"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.817394 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.820295 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.822134 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.825110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdb834d-3fdb-4bed-b349-e41c60ec4d64","Type":"ContainerDied","Data":"27624a3ee861162764efc4d71981b850fb235ccd4efc483dcb937fd0d6cf78fb"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.825170 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.825179 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 24 11:28:34 crc kubenswrapper[4752]: E1124 11:28:34.825224 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.832629 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16fc61a1-6794-4ecf-a3dd-d79a88eda486","Type":"ContainerDied","Data":"a7ae944d23e0ca690cf82ed28c91c09d4ee094baf62771943265553e01af552f"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.832688 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.844567 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57b7bbd86d-9drzs" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.851123 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell080e2-account-delete-8s7w5" podUID="273fc6bc-214e-4fca-8939-003f983d6aa1" containerName="mariadb-account-delete" containerID="cri-o://51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1" gracePeriod=30 Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.851359 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68f8b8b648-65q5g" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.851616 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.852630 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68f8b8b648-65q5g" event={"ID":"6c592991-2ffb-417a-aa80-49d0111618bb","Type":"ContainerDied","Data":"ce2c6985b321235d1278471374362c9d02efbc790fbe346fd16a502f6a1612f5"} Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.852892 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement83dd-account-delete-87gjb" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.852915 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron574b-account-delete-nw8qc" podUID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" containerName="mariadb-account-delete" containerID="cri-o://0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88" gracePeriod=30 Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.853371 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.853417 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder3b7b-account-delete-rj27n" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.853456 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.853678 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.853764 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapib683-account-delete-k7898" podUID="11bd1a8d-cbf9-47c6-a116-538f60634a33" containerName="mariadb-account-delete" containerID="cri-o://fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527" gracePeriod=30 Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.864342 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr8g\" (UniqueName: \"kubernetes.io/projected/51186e60-74ec-447f-afd0-14056f01d5a3-kube-api-access-pjr8g\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:34 crc kubenswrapper[4752]: I1124 11:28:34.936032 4752 scope.go:117] "RemoveContainer" containerID="9e7930d8efc895b38537d979ec9125f2a956817fea154183ec332abf612d2f8c" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.006597 4752 scope.go:117] "RemoveContainer" containerID="2163dde349442cb3ffdc514182a2490f24052d186366d6badae0a20befe860b1" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.051795 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.060879 4752 scope.go:117] "RemoveContainer" containerID="9139c1b70a2b425efb74d8d55e0b897e54430b4e552e063513eb97a27c8f6d2b" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.256089 4752 scope.go:117] "RemoveContainer" containerID="b64a3d9cdc49b880ccf1a894ebc75a567c3fbafc89ff71380851b7711949247f" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.324437 4752 scope.go:117] "RemoveContainer" containerID="3511a8fadfd4d408af8dd7cfad50cde220b8abfee4439a6034133cc037a305e8" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.329389 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.356158 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.363128 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.369355 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.375547 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.375704 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.382705 4752 scope.go:117] "RemoveContainer" containerID="3e3197854a26bf19017707f823b368d90d33292e1a5a4ca1ccb629c45420d998" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.387427 4752 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.391501 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.391569 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:28:37.391554078 +0000 UTC m=+1323.376374367 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.393495 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.393584 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:37.393562276 +0000 UTC m=+1323.378382635 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.400739 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.407473 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"] Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.410882 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.420205 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-57b7bbd86d-9drzs"] Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.424388 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 11:28:35 crc kubenswrapper[4752]: E1124 11:28:35.424464 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.429437 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.437324 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.440001 4752 scope.go:117] "RemoveContainer" containerID="42af4a9e7e4f79d0e530981eec488f3193a8e0ab8487ba8583a6beacc33cd3f9" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.442454 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.448301 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-68f8b8b648-65q5g"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.457506 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.464657 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.471978 4752 scope.go:117] "RemoveContainer" containerID="820c72aca67b79ea64a516208b0eaa7594ab5684ba110221a1b876bae259b74a" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.482675 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.500333 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.502304 4752 scope.go:117] "RemoveContainer" containerID="6d5f6c189c0f341ca1cfaf61e6a7acbcdd7eb6be88bcb7818cbe04ea74b7b501" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.506072 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.513955 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.521949 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.529077 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.542117 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.549002 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell119d4-account-delete-vksfq"] Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.660046 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.801067 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.819685 4752 scope.go:117] "RemoveContainer" containerID="944ddefc80e86ba009045369f39188fcfd4ed96c67435d31b540b52189cc2961" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.844475 4752 scope.go:117] "RemoveContainer" containerID="4903a5653dc6c301907cc4b8a087af2718aabe36426414f0ba709fb8e91a2e8c" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.881375 4752 scope.go:117] "RemoveContainer" containerID="58ade1c43e6907e3577011461f577c10b33e8682d39ece87e8fbf57d78ed060c" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.890392 4752 generic.go:334] "Generic (PLEG): container finished" podID="1cd9d1a6-a562-443e-b16d-76c159107794" containerID="0a1dcccaeed6faa3a0411c4c11965da3a4a0622d15fdfb5a731e239a841bf153" exitCode=0 Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.890714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4ddd687-vk74x" event={"ID":"1cd9d1a6-a562-443e-b16d-76c159107794","Type":"ContainerDied","Data":"0a1dcccaeed6faa3a0411c4c11965da3a4a0622d15fdfb5a731e239a841bf153"} Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.894019 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerID="d0225139306006f6265b3807404dbc72413aeefa6c2ec7ca1125827941ca3fbd" exitCode=0 Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.894064 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerDied","Data":"d0225139306006f6265b3807404dbc72413aeefa6c2ec7ca1125827941ca3fbd"} Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906323 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906357 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906407 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906468 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906500 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906565 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.906605 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpsng\" (UniqueName: \"kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng\") pod \"30292578-890d-42e8-b689-b909de083b48\" (UID: \"30292578-890d-42e8-b689-b909de083b48\") " Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.907487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.907837 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.907871 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerDied","Data":"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72"} Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.907846 4752 generic.go:334] "Generic (PLEG): container finished" podID="30292578-890d-42e8-b689-b909de083b48" containerID="2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72" exitCode=0 Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.907963 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.908032 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"30292578-890d-42e8-b689-b909de083b48","Type":"ContainerDied","Data":"5c2f60fd67a0566ec2eb65f095e07acc01aceeeeb15ee5ace5d35d83a86ef470"} Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.908379 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.909367 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.913850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng" (OuterVolumeSpecName: "kube-api-access-zpsng") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "kube-api-access-zpsng". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.921860 4752 scope.go:117] "RemoveContainer" containerID="b77d7cf0b069a2acc502c8ead2f25a536ecf99b661c7953794e5aa4a197061a2" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.929154 4752 generic.go:334] "Generic (PLEG): container finished" podID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerID="d59dfb8e5264cefe368e6631c2fcd950f686b3a37ff453ac4489ed24628cf2fd" exitCode=0 Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.929225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerDied","Data":"d59dfb8e5264cefe368e6631c2fcd950f686b3a37ff453ac4489ed24628cf2fd"} Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.929961 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.933074 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7b4f-account-delete-jdrrz" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.988089 4752 scope.go:117] "RemoveContainer" containerID="2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72" Nov 24 11:28:35 crc kubenswrapper[4752]: I1124 11:28:35.988369 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.005225 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone7b4f-account-delete-jdrrz"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smttb\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008250 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008299 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008340 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008376 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008408 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008431 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008476 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008498 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008521 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.008632 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data\") pod \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\" (UID: \"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009065 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009090 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009102 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009114 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30292578-890d-42e8-b689-b909de083b48-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009126 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpsng\" (UniqueName: \"kubernetes.io/projected/30292578-890d-42e8-b689-b909de083b48-kube-api-access-zpsng\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.009137 4752 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30292578-890d-42e8-b689-b909de083b48-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.016212 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.019595 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.019834 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info" (OuterVolumeSpecName: "pod-info") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.022440 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.027005 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone7b4f-account-delete-jdrrz"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.032926 4752 scope.go:117] "RemoveContainer" containerID="62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.032962 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.033055 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.035263 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.037319 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.043984 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb" (OuterVolumeSpecName: "kube-api-access-smttb") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "kube-api-access-smttb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.044074 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30292578-890d-42e8-b689-b909de083b48" (UID: "30292578-890d-42e8-b689-b909de083b48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.077055 4752 scope.go:117] "RemoveContainer" containerID="2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72" Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.078988 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72\": container with ID starting with 2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72 not found: ID does not exist" containerID="2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.079515 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72"} err="failed to get container status \"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72\": rpc error: code = NotFound desc = could not find container \"2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72\": container with ID starting with 2be1c33a29006d4527d097e0563524c636dc5fa73eff7ea290845ce9f9a8aa72 not found: ID does not exist" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.079549 4752 scope.go:117] "RemoveContainer" containerID="62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f" Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.079957 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f\": container with ID starting with 62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f not found: ID does not exist" containerID="62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.079985 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f"} err="failed to get container status \"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f\": rpc error: code = NotFound desc = could not find container \"62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f\": container with ID starting with 62e09945d9b86b9c0065485c10b2db775ec7009251b346c71a1da8a7ab7d633f not found: ID does not exist" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.090275 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.102550 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf" (OuterVolumeSpecName: "server-conf") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.111980 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112018 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smttb\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-kube-api-access-smttb\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112030 4752 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-server-conf\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112039 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112049 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112057 4752 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112065 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112073 4752 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-pod-info\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112080 4752 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112089 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112109 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112118 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.112127 4752 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30292578-890d-42e8-b689-b909de083b48-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc 
kubenswrapper[4752]: I1124 11:28:36.112135 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgtdc\" (UniqueName: \"kubernetes.io/projected/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b-kube-api-access-lgtdc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.124560 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data" (OuterVolumeSpecName: "config-data") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.154324 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.156546 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.157779 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.199052 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" (UID: "a1f1e943-afb7-40c3-9ff2-56791f4e0ad5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213042 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213097 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213126 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213150 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213173 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213199 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213263 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4pjw\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213329 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj6pv\" (UniqueName: \"kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213361 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213418 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213442 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213463 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213503 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213545 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213580 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213611 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213639 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213665 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls\") pod \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\" (UID: \"3d820ba8-03d6-48f4-9423-bbc1ed64a36b\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.213688 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data\") pod \"1cd9d1a6-a562-443e-b16d-76c159107794\" (UID: \"1cd9d1a6-a562-443e-b16d-76c159107794\") " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.214070 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.214093 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.214105 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.215274 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.216238 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.216688 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.220100 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.221144 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.222616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts" (OuterVolumeSpecName: "scripts") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.222632 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.228987 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv" (OuterVolumeSpecName: "kube-api-access-xj6pv") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "kube-api-access-xj6pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.229006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.229005 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw" (OuterVolumeSpecName: "kube-api-access-r4pjw") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "kube-api-access-r4pjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.234901 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.237447 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info" (OuterVolumeSpecName: "pod-info") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.261841 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.265322 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.268276 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data" (OuterVolumeSpecName: "config-data") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.282626 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.285993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.293416 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data" (OuterVolumeSpecName: "config-data") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.301088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cd9d1a6-a562-443e-b16d-76c159107794" (UID: "1cd9d1a6-a562-443e-b16d-76c159107794"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.307221 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf" (OuterVolumeSpecName: "server-conf") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315772 4752 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-pod-info\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315805 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315814 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315822 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315832 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315848 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315857 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315866 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315875 4752 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315884 4752 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315892 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.315905 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.315994 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:40.315973948 +0000 UTC m=+1326.300794237 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.315913 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316098 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316119 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4pjw\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-kube-api-access-r4pjw\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316132 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj6pv\" (UniqueName: \"kubernetes.io/projected/1cd9d1a6-a562-443e-b16d-76c159107794-kube-api-access-xj6pv\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316144 4752 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-server-conf\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316174 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cd9d1a6-a562-443e-b16d-76c159107794-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.316185 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.316398 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.316442 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:40.316432311 +0000 UTC m=+1326.301252600 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.331350 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.343677 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3d820ba8-03d6-48f4-9423-bbc1ed64a36b" (UID: "3d820ba8-03d6-48f4-9423-bbc1ed64a36b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.418428 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d820ba8-03d6-48f4-9423-bbc1ed64a36b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.418469 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.418545 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: E1124 11:28:36.418611 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:40.418593174 +0000 UTC m=+1326.403413463 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.714899 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5krk8"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.728802 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5krk8"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.732215 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder3b7b-account-delete-rj27n"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.763057 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" path="/var/lib/kubelet/pods/1020727e-3725-4fed-b276-043ae4d30c4c/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.763694 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" path="/var/lib/kubelet/pods/16fc61a1-6794-4ecf-a3dd-d79a88eda486/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.764415 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b" path="/var/lib/kubelet/pods/29c6b87f-e4ea-4ffc-9e92-8d5e5cb5c36b/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.764779 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" path="/var/lib/kubelet/pods/2e8ac218-b256-4d82-afb6-34448254aa9f/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.766154 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30292578-890d-42e8-b689-b909de083b48" path="/var/lib/kubelet/pods/30292578-890d-42e8-b689-b909de083b48/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.766782 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" path="/var/lib/kubelet/pods/4bdb834d-3fdb-4bed-b349-e41c60ec4d64/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.767804 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51186e60-74ec-447f-afd0-14056f01d5a3" path="/var/lib/kubelet/pods/51186e60-74ec-447f-afd0-14056f01d5a3/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.768348 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" path="/var/lib/kubelet/pods/6c592991-2ffb-417a-aa80-49d0111618bb/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.768895 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" path="/var/lib/kubelet/pods/9d0084f6-064e-4089-87fe-4ead63923b56/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.770035 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" path="/var/lib/kubelet/pods/a6e4558a-350e-442a-ad61-70d9a9824219/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.770565 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" path="/var/lib/kubelet/pods/b3b08ab3-f6c3-417e-abce-e03994e7553d/volumes" Nov 24 11:28:36 crc 
kubenswrapper[4752]: I1124 11:28:36.771534 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" path="/var/lib/kubelet/pods/f3c01e5f-8338-497f-9e47-e2e3e857df63/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.772257 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa205dc7-c2a0-418c-a696-ccdfa4c0d43f" path="/var/lib/kubelet/pods/fa205dc7-c2a0-418c-a696-ccdfa4c0d43f/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.772816 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" path="/var/lib/kubelet/pods/ffd3886c-7122-4bd5-97d2-5c448c31e941/volumes" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.779318 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder3b7b-account-delete-rj27n"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.780393 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3b7b-account-create-6hhgc"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.780414 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3b7b-account-create-6hhgc"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.835779 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mbqd5"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.848835 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mbqd5"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.865346 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-83dd-account-create-m48bt"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.881676 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement83dd-account-delete-87gjb"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.898259 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-83dd-account-create-m48bt"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.905171 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement83dd-account-delete-87gjb"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.913213 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dp6s7"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.918760 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dp6s7"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.928937 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.929459 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbicanf581-account-delete-qm6qk" podUID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" containerName="mariadb-account-delete" containerID="cri-o://80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef" gracePeriod=30 Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.933934 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f581-account-create-hwjsp"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.938594 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f581-account-create-hwjsp"] Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.948048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d820ba8-03d6-48f4-9423-bbc1ed64a36b","Type":"ContainerDied","Data":"70fcd895ea7f430512fbb3aced2ca12c3fba7d5a82925481ac5e2520dbe70a4c"} Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.948286 4752 scope.go:117] "RemoveContainer" containerID="d0225139306006f6265b3807404dbc72413aeefa6c2ec7ca1125827941ca3fbd" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.948547 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.962018 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1077002f-be90-4e09-8158-efdc98329e5b/ovn-northd/0.log" Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.962881 4752 generic.go:334] "Generic (PLEG): container finished" podID="1077002f-be90-4e09-8158-efdc98329e5b" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" exitCode=139 Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.963049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerDied","Data":"cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7"} Nov 24 11:28:36 crc kubenswrapper[4752]: I1124 11:28:36.990107 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.005556 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a1f1e943-afb7-40c3-9ff2-56791f4e0ad5","Type":"ContainerDied","Data":"dfc4f105e340723be162ecae7dacaabe01750f1b6aaf460058b78ab7887bc5f1"} Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.005670 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.008533 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.016090 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4ddd687-vk74x" event={"ID":"1cd9d1a6-a562-443e-b16d-76c159107794","Type":"ContainerDied","Data":"4e5f7910f2ce4f6ad256efb05f29eb26420dbab368e692d14cce58e0dcac8968"} Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.016494 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f4ddd687-vk74x" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.021367 4752 scope.go:117] "RemoveContainer" containerID="ca63fd0de5f52cf43fe00bb9dfbd1436710419f23b5ae8d734e790c2aa804e2b" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.102986 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.108917 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.114132 4752 scope.go:117] "RemoveContainer" containerID="d59dfb8e5264cefe368e6631c2fcd950f686b3a37ff453ac4489ed24628cf2fd" Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.127484 4752 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Nov 24 11:28:37 crc kubenswrapper[4752]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-11-24T11:28:29Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 24 11:28:37 crc kubenswrapper[4752]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Nov 24 11:28:37 crc kubenswrapper[4752]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-xmc2s" message=< Nov 24 11:28:37 crc kubenswrapper[4752]: Exiting ovn-controller (1) [FAILED] Nov 24 11:28:37 crc kubenswrapper[4752]: Killing ovn-controller (1) [ OK ] Nov 24 11:28:37 crc kubenswrapper[4752]: Killing ovn-controller (1) with SIGKILL [ OK ] Nov 24 11:28:37 crc kubenswrapper[4752]: 2025-11-24T11:28:29Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 24 11:28:37 crc kubenswrapper[4752]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Nov 24 11:28:37 crc kubenswrapper[4752]: > Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.127554 4752 kuberuntime_container.go:691] "PreStop hook failed" err=< Nov 24 11:28:37 crc kubenswrapper[4752]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-11-24T11:28:29Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Nov 24 11:28:37 crc kubenswrapper[4752]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Nov 24 11:28:37 crc kubenswrapper[4752]: > pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" containerID="cri-o://a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.127590 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-xmc2s" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" containerID="cri-o://a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13" gracePeriod=21 Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.167629 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.179340 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7f4ddd687-vk74x"] Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.191474 4752 scope.go:117] "RemoveContainer" containerID="50d83bb320cd3643b150ab5645490c9c124d9eff8abf9994e68c250ab8f36395" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.230278 4752 scope.go:117] "RemoveContainer" containerID="0a1dcccaeed6faa3a0411c4c11965da3a4a0622d15fdfb5a731e239a841bf153" 
Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.251776 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1077002f-be90-4e09-8158-efdc98329e5b/ovn-northd/0.log" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.251870 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345515 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345646 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czr57\" (UniqueName: \"kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345695 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345791 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.345916 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir\") pod \"1077002f-be90-4e09-8158-efdc98329e5b\" (UID: \"1077002f-be90-4e09-8158-efdc98329e5b\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.346832 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts" (OuterVolumeSpecName: "scripts") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.347245 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.347472 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.347511 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config" (OuterVolumeSpecName: "config") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.353096 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57" (OuterVolumeSpecName: "kube-api-access-czr57") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "kube-api-access-czr57". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.370174 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.425555 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.427066 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "1077002f-be90-4e09-8158-efdc98329e5b" (UID: "1077002f-be90-4e09-8158-efdc98329e5b"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.457590 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.457617 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1077002f-be90-4e09-8158-efdc98329e5b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.457626 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1077002f-be90-4e09-8158-efdc98329e5b-config\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.457639 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.457643 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.457725 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:41.457700809 +0000 UTC m=+1327.442521158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.457652 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czr57\" (UniqueName: \"kubernetes.io/projected/1077002f-be90-4e09-8158-efdc98329e5b-kube-api-access-czr57\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.458183 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1077002f-be90-4e09-8158-efdc98329e5b-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.458451 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:28:37 crc kubenswrapper[4752]: E1124 11:28:37.458724 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:28:41.458703148 +0000 UTC m=+1327.443523527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.478006 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.559589 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4qv4\" (UniqueName: \"kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4\") pod \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.559864 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle\") pod \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.559897 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data\") pod \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\" (UID: \"5be4efe6-ed60-4417-a84c-8ff27bf4a685\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.563717 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4" (OuterVolumeSpecName: "kube-api-access-s4qv4") pod "5be4efe6-ed60-4417-a84c-8ff27bf4a685" (UID: "5be4efe6-ed60-4417-a84c-8ff27bf4a685"). InnerVolumeSpecName "kube-api-access-s4qv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.608726 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.617362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data" (OuterVolumeSpecName: "config-data") pod "5be4efe6-ed60-4417-a84c-8ff27bf4a685" (UID: "5be4efe6-ed60-4417-a84c-8ff27bf4a685"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.626300 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5be4efe6-ed60-4417-a84c-8ff27bf4a685" (UID: "5be4efe6-ed60-4417-a84c-8ff27bf4a685"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle\") pod \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661164 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data\") pod \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hxms\" (UniqueName: \"kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms\") pod \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\" (UID: \"fd071fb7-f9a2-4f4a-aad6-90c340f0d009\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661776 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4qv4\" (UniqueName: \"kubernetes.io/projected/5be4efe6-ed60-4417-a84c-8ff27bf4a685-kube-api-access-s4qv4\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661797 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.661810 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5be4efe6-ed60-4417-a84c-8ff27bf4a685-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.668900 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xmc2s_9495af0b-dca8-4695-8f40-f4f9a7ec5229/ovn-controller/0.log" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.668959 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.669874 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms" (OuterVolumeSpecName: "kube-api-access-8hxms") pod "fd071fb7-f9a2-4f4a-aad6-90c340f0d009" (UID: "fd071fb7-f9a2-4f4a-aad6-90c340f0d009"). InnerVolumeSpecName "kube-api-access-8hxms". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.690231 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data" (OuterVolumeSpecName: "config-data") pod "fd071fb7-f9a2-4f4a-aad6-90c340f0d009" (UID: "fd071fb7-f9a2-4f4a-aad6-90c340f0d009"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.690939 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd071fb7-f9a2-4f4a-aad6-90c340f0d009" (UID: "fd071fb7-f9a2-4f4a-aad6-90c340f0d009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763246 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763289 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j8xn\" (UniqueName: \"kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763389 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763417 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763435 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763464 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run" (OuterVolumeSpecName: "var-run") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.763589 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts\") pod \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\" (UID: \"9495af0b-dca8-4695-8f40-f4f9a7ec5229\") " Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764176 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hxms\" (UniqueName: \"kubernetes.io/projected/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-kube-api-access-8hxms\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764191 4752 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764201 4752 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-run\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764212 4752 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9495af0b-dca8-4695-8f40-f4f9a7ec5229-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764222 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.764233 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd071fb7-f9a2-4f4a-aad6-90c340f0d009-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.766517 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts" (OuterVolumeSpecName: "scripts") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.767487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn" (OuterVolumeSpecName: "kube-api-access-2j8xn") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "kube-api-access-2j8xn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.805891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.834417 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "9495af0b-dca8-4695-8f40-f4f9a7ec5229" (UID: "9495af0b-dca8-4695-8f40-f4f9a7ec5229"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.865302 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.865333 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j8xn\" (UniqueName: \"kubernetes.io/projected/9495af0b-dca8-4695-8f40-f4f9a7ec5229-kube-api-access-2j8xn\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.865343 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9495af0b-dca8-4695-8f40-f4f9a7ec5229-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:37 crc kubenswrapper[4752]: I1124 11:28:37.865352 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9495af0b-dca8-4695-8f40-f4f9a7ec5229-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.034177 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1077002f-be90-4e09-8158-efdc98329e5b/ovn-northd/0.log" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.034565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1077002f-be90-4e09-8158-efdc98329e5b","Type":"ContainerDied","Data":"1ddb11e8cd6745332066e35a57f83011763d8e124d2b3c44c89bcc30b301c98e"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.034636 4752 scope.go:117] "RemoveContainer" containerID="04d87e64c2950336bd5ce6ab63d3778aa490d6f9336803686ba547e5f016a8d5" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.036607 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.041640 4752 generic.go:334] "Generic (PLEG): container finished" podID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" exitCode=0 Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.041798 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fd071fb7-f9a2-4f4a-aad6-90c340f0d009","Type":"ContainerDied","Data":"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.041814 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.041847 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fd071fb7-f9a2-4f4a-aad6-90c340f0d009","Type":"ContainerDied","Data":"4d6071de5a0edc39a54a13c20117ae1d903587e9e41e885ca753b6c25d7d3f06"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.046852 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xmc2s_9495af0b-dca8-4695-8f40-f4f9a7ec5229/ovn-controller/0.log" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.046941 4752 generic.go:334] "Generic (PLEG): container finished" podID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerID="a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13" exitCode=137 Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.047072 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s" event={"ID":"9495af0b-dca8-4695-8f40-f4f9a7ec5229","Type":"ContainerDied","Data":"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.047107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xmc2s" event={"ID":"9495af0b-dca8-4695-8f40-f4f9a7ec5229","Type":"ContainerDied","Data":"97ae2b312225cdca306e777fe223cca18db358f282d54afa66de7b1071301f4f"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.047276 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xmc2s" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.055216 4752 generic.go:334] "Generic (PLEG): container finished" podID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc" exitCode=0 Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.055251 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5be4efe6-ed60-4417-a84c-8ff27bf4a685","Type":"ContainerDied","Data":"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.055278 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5be4efe6-ed60-4417-a84c-8ff27bf4a685","Type":"ContainerDied","Data":"8836136b2a27c78c3afa53d3e96201339c94b6310c47657ce6f7730172f2a62a"} Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.055343 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.080037 4752 scope.go:117] "RemoveContainer" containerID="cbc8819aaddd8d4d2d0dad826bb4ce82df8a1c47c68ac131ce481a35067952c7" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.103362 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.114079 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.124318 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.128084 4752 scope.go:117] "RemoveContainer" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.135867 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.141828 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.148429 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.153249 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.157713 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xmc2s"] Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.158181 4752 scope.go:117] "RemoveContainer" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.158522 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4\": container with ID starting with 84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4 not found: ID does not exist" containerID="84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.158550 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4"} err="failed to get container status \"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4\": rpc error: code = NotFound desc = could not find container \"84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4\": container with ID starting with 84100f51acbc89dc3eef175fd930e0f2a3ebebe505fb5e9794dfc4d2790209b4 not found: ID does not exist" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.158574 4752 scope.go:117] "RemoveContainer" containerID="a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13" Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.194150 4752 scope.go:117] "RemoveContainer" containerID="a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13" Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.194538 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13\": container with ID starting with 
a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13 not found: ID does not exist" containerID="a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.194565 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13"} err="failed to get container status \"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13\": rpc error: code = NotFound desc = could not find container \"a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13\": container with ID starting with a049cd20437a840deeead794f8a8f79bd0e17445da58b94a4ff6632215070a13 not found: ID does not exist"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.194587 4752 scope.go:117] "RemoveContainer" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.216190 4752 scope.go:117] "RemoveContainer" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.217373 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc\": container with ID starting with d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc not found: ID does not exist" containerID="d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.217432 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc"} err="failed to get container status \"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc\": rpc error: code = NotFound desc = could not find container \"d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc\": container with ID starting with d926ac7aaad802ad821b9f117c72a63b843e3913b61ff928707f207f37dcd1bc not found: ID does not exist"
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.224967 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/887a71f4ccf96b0ba63fea4ceeddb3c7df1a6f1a5924e81ed7d12efe038078a0/diff" to get inode usage: stat /var/lib/containers/storage/overlay/887a71f4ccf96b0ba63fea4ceeddb3c7df1a6f1a5924e81ed7d12efe038078a0/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ovn-controller-xmc2s_9495af0b-dca8-4695-8f40-f4f9a7ec5229/ovn-controller/0.log" to get inode usage: stat /var/log/pods/openstack_ovn-controller-xmc2s_9495af0b-dca8-4695-8f40-f4f9a7ec5229/ovn-controller/0.log: no such file or directory
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.737512 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d" path="/var/lib/kubelet/pods/04ae1a5e-39a4-40d2-8a7f-dd5e7dac6c0d/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.738149 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1077002f-be90-4e09-8158-efdc98329e5b" path="/var/lib/kubelet/pods/1077002f-be90-4e09-8158-efdc98329e5b/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.738703 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd9d1a6-a562-443e-b16d-76c159107794" path="/var/lib/kubelet/pods/1cd9d1a6-a562-443e-b16d-76c159107794/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.739980 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" path="/var/lib/kubelet/pods/3d820ba8-03d6-48f4-9423-bbc1ed64a36b/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.740534 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" path="/var/lib/kubelet/pods/5be4efe6-ed60-4417-a84c-8ff27bf4a685/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.741003 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9d21e3-863e-4129-a794-41c3bb8899df" path="/var/lib/kubelet/pods/6b9d21e3-863e-4129-a794-41c3bb8899df/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.742124 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb3afd6-0557-4a85-9763-1572d92e6aa3" path="/var/lib/kubelet/pods/7cb3afd6-0557-4a85-9763-1572d92e6aa3/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.742654 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df74304-1dc2-466a-b089-ecfa45af2ce1" path="/var/lib/kubelet/pods/7df74304-1dc2-466a-b089-ecfa45af2ce1/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.743155 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" path="/var/lib/kubelet/pods/9495af0b-dca8-4695-8f40-f4f9a7ec5229/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.743790 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" path="/var/lib/kubelet/pods/a1f1e943-afb7-40c3-9ff2-56791f4e0ad5/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.744966 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa308cc4-8cc5-4a63-926a-033a151f7291" path="/var/lib/kubelet/pods/aa308cc4-8cc5-4a63-926a-033a151f7291/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.745437 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd74806-0b12-4c05-9594-acbaf92b9af9" path="/var/lib/kubelet/pods/acd74806-0b12-4c05-9594-acbaf92b9af9/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.745924 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa3eaaa-7445-48bc-80fc-753d38628a73" path="/var/lib/kubelet/pods/baa3eaaa-7445-48bc-80fc-753d38628a73/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: I1124 11:28:38.746764 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" path="/var/lib/kubelet/pods/fd071fb7-f9a2-4f4a-aad6-90c340f0d009/volumes"
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.890825 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.891190 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.891473 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.891504 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.891879 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.893986 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.895212 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:38 crc kubenswrapper[4752]: E1124 11:28:38.895242 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.316712 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.316760 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.317108 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:48.317088907 +0000 UTC m=+1334.301909196 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.317136 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:48.317120358 +0000 UTC m=+1334.301940647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.520036 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:40 crc kubenswrapper[4752]: E1124 11:28:40.520114 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:48.520098175 +0000 UTC m=+1334.504918474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found
Nov 24 11:28:41 crc kubenswrapper[4752]: E1124 11:28:41.536601 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:41 crc kubenswrapper[4752]: E1124 11:28:41.536674 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:28:49.536658148 +0000 UTC m=+1335.521478437 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found
Nov 24 11:28:41 crc kubenswrapper[4752]: E1124 11:28:41.536756 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:41 crc kubenswrapper[4752]: E1124 11:28:41.536836 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:28:49.536813572 +0000 UTC m=+1335.521633871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found
Nov 24 11:28:42 crc kubenswrapper[4752]: E1124 11:28:42.434148 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/370397f586d8ab419afe777baea0e0cdb2566ec6a78b39294678bae3e41bd15a/diff" to get inode usage: stat /var/lib/containers/storage/overlay/370397f586d8ab419afe777baea0e0cdb2566ec6a78b39294678bae3e41bd15a/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_openstack-galera-0_30292578-890d-42e8-b689-b909de083b48/galera/0.log" to get inode usage: stat /var/log/pods/openstack_openstack-galera-0_30292578-890d-42e8-b689-b909de083b48/galera/0.log: no such file or directory
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.891351 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.892351 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.892491 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.892807 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.892844 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.895163 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.897212 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:43 crc kubenswrapper[4752]: E1124 11:28:43.897249 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.115796 4752 generic.go:334] "Generic (PLEG): container finished" podID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerID="31f4463200798c8e71465693c21c3b267736953dd9c6093042d651dea9d77b08" exitCode=0
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.115856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerDied","Data":"31f4463200798c8e71465693c21c3b267736953dd9c6093042d651dea9d77b08"}
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.115897 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd6c8857-kp42z" event={"ID":"e9afd9c8-7b23-44fb-99ce-7924833d4589","Type":"ContainerDied","Data":"2ab9d714009d285cbf3a02722516d82bfbff37f2e74cc178d47099771bf093e0"}
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.115918 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab9d714009d285cbf3a02722516d82bfbff37f2e74cc178d47099771bf093e0"
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.140033 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd6c8857-kp42z"
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.182848 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183004 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183055 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183107 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183164 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.183247 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmllg\" (UniqueName: \"kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg\") pod \"e9afd9c8-7b23-44fb-99ce-7924833d4589\" (UID: \"e9afd9c8-7b23-44fb-99ce-7924833d4589\") "
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.192565 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg" (OuterVolumeSpecName: "kube-api-access-nmllg") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "kube-api-access-nmllg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.215841 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.236396 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config" (OuterVolumeSpecName: "config") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.247236 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.253858 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.259291 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.278039 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e9afd9c8-7b23-44fb-99ce-7924833d4589" (UID: "e9afd9c8-7b23-44fb-99ce-7924833d4589"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284940 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284959 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-config\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284968 4752 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284976 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmllg\" (UniqueName: \"kubernetes.io/projected/e9afd9c8-7b23-44fb-99ce-7924833d4589-kube-api-access-nmllg\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284986 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-httpd-config\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.284994 4752 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:44 crc kubenswrapper[4752]: I1124 11:28:44.285006 4752 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9afd9c8-7b23-44fb-99ce-7924833d4589-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:45 crc kubenswrapper[4752]: I1124 11:28:45.137804 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd6c8857-kp42z"
Nov 24 11:28:45 crc kubenswrapper[4752]: I1124 11:28:45.164261 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"]
Nov 24 11:28:45 crc kubenswrapper[4752]: I1124 11:28:45.168626 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56dd6c8857-kp42z"]
Nov 24 11:28:45 crc kubenswrapper[4752]: E1124 11:28:45.954469 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/d230e52cc82f5a5f19489946f8579cc713abadb6faa67c8738715c795aef9e1d/diff" to get inode usage: stat /var/lib/containers/storage/overlay/d230e52cc82f5a5f19489946f8579cc713abadb6faa67c8738715c795aef9e1d/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-scheduler-0_fd071fb7-f9a2-4f4a-aad6-90c340f0d009/nova-scheduler-scheduler/0.log" to get inode usage: stat /var/log/pods/openstack_nova-scheduler-0_fd071fb7-f9a2-4f4a-aad6-90c340f0d009/nova-scheduler-scheduler/0.log: no such file or directory
Nov 24 11:28:46 crc kubenswrapper[4752]: I1124 11:28:46.752832 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" path="/var/lib/kubelet/pods/e9afd9c8-7b23-44fb-99ce-7924833d4589/volumes"
Nov 24 11:28:47 crc kubenswrapper[4752]: E1124 11:28:47.928549 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a26097d2990fd2d829fc8e63d4682dbc96cae86b26f0013709236cf69e073480/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a26097d2990fd2d829fc8e63d4682dbc96cae86b26f0013709236cf69e073480/diff: no such file or directory, extraDiskErr:
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.360688 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.360794 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts podName:4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:04.360777023 +0000 UTC m=+1350.345597312 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts") pod "glancef4c5-account-delete-kjkjp" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0") : configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.361122 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.361609 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:04.361593907 +0000 UTC m=+1350.346414196 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.563834 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.563925 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:04.563905914 +0000 UTC m=+1350.548726203 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.890202 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.891123 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.891710 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.891772 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.891792 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.896194 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.899189 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:48 crc kubenswrapper[4752]: E1124 11:28:48.899278 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:28:49 crc kubenswrapper[4752]: E1124 11:28:49.581230 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:49 crc kubenswrapper[4752]: E1124 11:28:49.581280 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 24 11:28:49 crc kubenswrapper[4752]: E1124 11:28:49.581324 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts podName:11bd1a8d-cbf9-47c6-a116-538f60634a33 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:05.581298292 +0000 UTC m=+1351.566118581 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts") pod "novaapib683-account-delete-k7898" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33") : configmap "openstack-scripts" not found
Nov 24 11:28:49 crc kubenswrapper[4752]: E1124 11:28:49.581348 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts podName:a8b9f59e-8570-4b2a-9c7b-14be641f74fe nodeName:}" failed. No retries permitted until 2025-11-24 11:29:05.581340323 +0000 UTC m=+1351.566160612 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts") pod "neutron574b-account-delete-nw8qc" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe") : configmap "openstack-scripts" not found
Nov 24 11:28:51 crc kubenswrapper[4752]: E1124 11:28:51.661092 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/0b22d3bd1b6ad477b6a633b9313d14787f4d796a9cadaed30776ab00dfb3dace/diff" to get inode usage: stat /var/lib/containers/storage/overlay/0b22d3bd1b6ad477b6a633b9313d14787f4d796a9cadaed30776ab00dfb3dace/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ovn-northd-0_1077002f-be90-4e09-8158-efdc98329e5b/ovn-northd/0.log" to get inode usage: stat /var/log/pods/openstack_ovn-northd-0_1077002f-be90-4e09-8158-efdc98329e5b/ovn-northd/0.log: no such file or directory
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.890002 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.890615 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.890966 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.891018 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.892130 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.893490 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.894828 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:53 crc kubenswrapper[4752]: E1124 11:28:53.894906 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:28:58 crc kubenswrapper[4752]: I1124 11:28:58.389801 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-v6jhw_26da9a73-3ede-476b-bfab-08839a8b88d1/ovs-vswitchd/0.log"
Nov 24 11:28:58 crc kubenswrapper[4752]: I1124 11:28:58.390663 4752 generic.go:334] "Generic (PLEG): container finished" podID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" exitCode=137
Nov 24 11:28:58 crc kubenswrapper[4752]: I1124 11:28:58.390700 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerDied","Data":"3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253"}
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.890146 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253 is running failed: container process not found" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.890163 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.890942 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.890990 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253 is running failed: container process not found" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.891301 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.891326 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253 is running failed: container process not found" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.891363 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:28:58 crc kubenswrapper[4752]: E1124 11:28:58.891335 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-v6jhw" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:28:58 crc kubenswrapper[4752]: I1124 11:28:58.921962 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.059709 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") pod \"d140408d-b7e5-4ba0-9404-877498cf18a1\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.059865 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d140408d-b7e5-4ba0-9404-877498cf18a1\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.059912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjs2z\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z\") pod \"d140408d-b7e5-4ba0-9404-877498cf18a1\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.059984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock\") pod \"d140408d-b7e5-4ba0-9404-877498cf18a1\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.060013 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache\") pod \"d140408d-b7e5-4ba0-9404-877498cf18a1\" (UID: \"d140408d-b7e5-4ba0-9404-877498cf18a1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.060616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock" (OuterVolumeSpecName: "lock") pod "d140408d-b7e5-4ba0-9404-877498cf18a1" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.061170 4752 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-lock\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.061327 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache" (OuterVolumeSpecName: "cache") pod "d140408d-b7e5-4ba0-9404-877498cf18a1" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.065015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z" (OuterVolumeSpecName: "kube-api-access-mjs2z") pod "d140408d-b7e5-4ba0-9404-877498cf18a1" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1"). InnerVolumeSpecName "kube-api-access-mjs2z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.070531 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d140408d-b7e5-4ba0-9404-877498cf18a1" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.070879 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "d140408d-b7e5-4ba0-9404-877498cf18a1" (UID: "d140408d-b7e5-4ba0-9404-877498cf18a1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.138165 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-v6jhw_26da9a73-3ede-476b-bfab-08839a8b88d1/ovs-vswitchd/0.log"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.139137 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-v6jhw"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163171 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6whzx\" (UniqueName: \"kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163304 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163495 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs\") pod \"26da9a73-3ede-476b-bfab-08839a8b88d1\" (UID: \"26da9a73-3ede-476b-bfab-08839a8b88d1\") "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163905 4752 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d140408d-b7e5-4ba0-9404-877498cf18a1-cache\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.163929 4752 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-etc-swift\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.164021 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log" (OuterVolumeSpecName: "var-log") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.164560 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib" (OuterVolumeSpecName: "var-lib") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.164610 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run" (OuterVolumeSpecName: "var-run") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.164708 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.165420 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts" (OuterVolumeSpecName: "scripts") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.165483 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.165499 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjs2z\" (UniqueName: \"kubernetes.io/projected/d140408d-b7e5-4ba0-9404-877498cf18a1-kube-api-access-mjs2z\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.168344 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx" (OuterVolumeSpecName: "kube-api-access-6whzx") pod "26da9a73-3ede-476b-bfab-08839a8b88d1" (UID: "26da9a73-3ede-476b-bfab-08839a8b88d1"). InnerVolumeSpecName "kube-api-access-6whzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.182486 4752 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.266945 4752 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-run\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.266983 4752 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-etc-ovs\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.266995 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6whzx\" (UniqueName: \"kubernetes.io/projected/26da9a73-3ede-476b-bfab-08839a8b88d1-kube-api-access-6whzx\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.267011 4752 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-lib\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.267022 4752 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/26da9a73-3ede-476b-bfab-08839a8b88d1-var-log\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.267037 4752 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.267052 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26da9a73-3ede-476b-bfab-08839a8b88d1-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.441722 4752 generic.go:334] "Generic (PLEG): container finished" podID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerID="3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776" exitCode=137
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.441797 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"}
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.441878 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d140408d-b7e5-4ba0-9404-877498cf18a1","Type":"ContainerDied","Data":"7ed21dba93fa06628760ea5c9289230f478fea6824ab9e73cdc301dbb0a189b7"}
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.441905 4752 scope.go:117] "RemoveContainer" containerID="3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.441975 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.447005 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-v6jhw_26da9a73-3ede-476b-bfab-08839a8b88d1/ovs-vswitchd/0.log"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.448339 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-v6jhw" event={"ID":"26da9a73-3ede-476b-bfab-08839a8b88d1","Type":"ContainerDied","Data":"56fde739099e3c78f5d578401686680df965464a0a7168b59933697eff75109c"}
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.448445 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-v6jhw"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.489961 4752 scope.go:117] "RemoveContainer" containerID="9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.493871 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.506247 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.513583 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"]
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.519179 4752 scope.go:117] "RemoveContainer" containerID="22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.520140 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-v6jhw"]
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.540616 4752 scope.go:117] "RemoveContainer" containerID="40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.563625 4752 scope.go:117] "RemoveContainer" containerID="655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.583538 4752 scope.go:117] "RemoveContainer" containerID="43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.606280 4752 scope.go:117] "RemoveContainer" containerID="e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.627715 4752 scope.go:117] "RemoveContainer" containerID="767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.645504 4752 scope.go:117] "RemoveContainer" containerID="488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.664051 4752 scope.go:117] "RemoveContainer" containerID="b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.685256 4752 scope.go:117] "RemoveContainer" containerID="1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.724354 4752 scope.go:117] "RemoveContainer" containerID="0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.740728 4752 scope.go:117] "RemoveContainer" containerID="c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.769613 4752 scope.go:117] "RemoveContainer" containerID="8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.789121 4752 scope.go:117] "RemoveContainer" containerID="c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.807826 4752 scope.go:117] "RemoveContainer" containerID="3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.808299 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776\": container with ID starting with 3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776 not found: ID does not exist" containerID="3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.808331 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776"} err="failed to get container status \"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776\": rpc error: code = NotFound desc = could not find container \"3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776\": container with ID starting with 3be396b0e6f5e0aba349a55cbe34b21319263e884f766386de78d9f798a96776 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.808353 4752 scope.go:117] "RemoveContainer" containerID="9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.808800 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f\": container with ID starting with 9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f not found: ID does not exist" containerID="9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.808830 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f"} err="failed to get container status \"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f\": rpc error: code = NotFound desc = could not find container \"9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f\": container with ID starting with 9f08577ef801c3e7522c491d0c718c0de2696b6e95c04bd48ac0a4caf2a50f3f not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.808850 4752 scope.go:117] "RemoveContainer" containerID="22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.809133 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f\": container with ID starting with 22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f not found: ID does not exist" containerID="22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.809155 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f"} err="failed to get container status \"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f\": rpc error: code = NotFound desc = could not find container \"22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f\": container with ID starting with 22d66d38b77ecc103d240b9da3486e681fb8545959095b39b6a50d203a648e8f not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.809170 4752 scope.go:117] "RemoveContainer" containerID="40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.809536 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6\": container with ID starting with 40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6 not found: ID does not exist" containerID="40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.809555 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6"} err="failed to get container status \"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6\": rpc error: code = NotFound desc = could not find container \"40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6\": container with ID starting with 40650a43647baeef577e362f8903bd0a8586c76d9193a87a0ee8036e11ec9bd6 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.809568 4752 scope.go:117] "RemoveContainer" containerID="655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.809980 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6\": container with ID starting with 655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6 not found: ID does not exist" containerID="655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810069 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6"} err="failed to get container status \"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6\": rpc error: code = NotFound desc = could not find container \"655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6\": container with ID starting with 655afbf26b251383053ab1cd2f78be797251cca8e2d6d31b15777423d8e8b8d6 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810085 4752 scope.go:117] "RemoveContainer" containerID="43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.810390 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729\": container with ID starting with 43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729 not found: ID does not exist" containerID="43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810432 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729"} err="failed to get container status \"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729\": rpc error: code = NotFound desc = could not find container \"43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729\": container with ID starting with 43d8ca866663d4ecd92842929ff03ff00e617cdd862f758f162611f3514cc729 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810457 4752 scope.go:117] "RemoveContainer" containerID="e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.810730 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7\": container with ID starting with e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7 not found: ID does not exist" containerID="e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810756 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7"} err="failed to get container status \"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7\": rpc error: code = NotFound desc = could not find container \"e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7\": container with ID starting with e96acd23c65d5f84d30d5433b104a5d9611a4f0c7182eced372f80832d7b71b7 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.810937 4752 scope.go:117] "RemoveContainer" containerID="767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.811221 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478\": container with ID starting with 767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478 not found: ID does not exist" containerID="767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.811266 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478"} err="failed to get container status \"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478\": rpc error: code = NotFound desc = could not find container \"767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478\": container with ID starting with 767dc8c699d90b9a5af04f3792d4ddd678467deee8fa5af4963960ebd783d478 not found: ID does not exist"
Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.811296 4752 scope.go:117] "RemoveContainer" containerID="488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae"
Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.811621 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae\": container with ID starting with 488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae not found: ID does not exist" containerID="488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.811650 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae"} err="failed to get container status \"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae\": rpc error: code = NotFound desc = could not find container \"488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae\": container with ID starting with 488850f07865f9460361539e0d1fa936b99c6aeeb6356f50bc78ab5f8d90ffae not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.811672 4752 scope.go:117] "RemoveContainer" containerID="b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e" Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.812223 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e\": container with ID starting with b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e not found: ID does not exist" containerID="b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.812243 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e"} err="failed to get container status \"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e\": rpc error: code = NotFound desc = could not find container \"b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e\": container with ID starting with b223ea43e6b2067d0faea7345f42301d5670008d230a3af4e07aa9d728dd386e not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.812255 4752 scope.go:117] "RemoveContainer" containerID="1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317" Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.812543 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317\": container with ID starting with 1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317 not found: ID does not exist" containerID="1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.812574 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317"} err="failed to get container status \"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317\": rpc error: code = NotFound desc = could not find container \"1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317\": container with ID starting with 1f69e9bf406833b72e92b79eb1f7d0d4f08fa2bf38125f2569affd7aa3802317 not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.812605 4752 scope.go:117] "RemoveContainer" containerID="0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124" Nov 24 11:28:59 crc 
kubenswrapper[4752]: E1124 11:28:59.813317 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124\": container with ID starting with 0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124 not found: ID does not exist" containerID="0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.813349 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124"} err="failed to get container status \"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124\": rpc error: code = NotFound desc = could not find container \"0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124\": container with ID starting with 0aa9b465d5873d31219e417b504faeddfc33faecc4e6894d5ac1e3bed4d23124 not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.813364 4752 scope.go:117] "RemoveContainer" containerID="c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4" Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.813548 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4\": container with ID starting with c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4 not found: ID does not exist" containerID="c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.813630 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4"} err="failed to get container status \"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4\": rpc error: code = NotFound desc = could not find container \"c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4\": container with ID starting with c40fe6199d59ba83d6f5ecc88363dc9b53f3780bec0cb8d3bf3828832d55bce4 not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.813643 4752 scope.go:117] "RemoveContainer" containerID="8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75" Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.813970 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75\": container with ID starting with 8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75 not found: ID does not exist" containerID="8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.813985 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75"} err="failed to get container status \"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75\": rpc error: code = NotFound desc = could not find container \"8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75\": container with ID starting with 8800b3e493a8c8c3ffef58fdad6314f8858048be66617819eddea42dd54bfa75 not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: 
I1124 11:28:59.813995 4752 scope.go:117] "RemoveContainer" containerID="c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177" Nov 24 11:28:59 crc kubenswrapper[4752]: E1124 11:28:59.814211 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177\": container with ID starting with c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177 not found: ID does not exist" containerID="c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.814229 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177"} err="failed to get container status \"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177\": rpc error: code = NotFound desc = could not find container \"c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177\": container with ID starting with c2e4b5f4a8a1169d5bc08d2b66fc675cb52f5aca6edec61cb505367c31a96177 not found: ID does not exist" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.814240 4752 scope.go:117] "RemoveContainer" containerID="3d7ebdd7e2accb69d034418b3619ad363a595fd55694997a32bd7b3f79d31253" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.859677 4752 scope.go:117] "RemoveContainer" containerID="8c0eccc2247392b60ee7a893a0070b14f66b5a505f1d0e1000c0311b78c2d3ce" Nov 24 11:28:59 crc kubenswrapper[4752]: I1124 11:28:59.898438 4752 scope.go:117] "RemoveContainer" containerID="95748596e6497c0d2fb881e9a27eb4fca9a23e8c544c31ba9b578f2427735437" Nov 24 11:29:00 crc kubenswrapper[4752]: E1124 11:29:00.352551 4752 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/2d5ccf2841eb2c0dcec26219c72ec7c0c0ce7d700649facd572385cd3b5db293/diff" to get inode usage: stat /var/lib/containers/storage/overlay/2d5ccf2841eb2c0dcec26219c72ec7c0c0ce7d700649facd572385cd3b5db293/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-cell1-conductor-0_5be4efe6-ed60-4417-a84c-8ff27bf4a685/nova-cell1-conductor-conductor/0.log" to get inode usage: stat /var/log/pods/openstack_nova-cell1-conductor-0_5be4efe6-ed60-4417-a84c-8ff27bf4a685/nova-cell1-conductor-conductor/0.log: no such file or directory Nov 24 11:29:00 crc kubenswrapper[4752]: I1124 11:29:00.738259 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" path="/var/lib/kubelet/pods/26da9a73-3ede-476b-bfab-08839a8b88d1/volumes" Nov 24 11:29:00 crc kubenswrapper[4752]: I1124 11:29:00.739401 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" path="/var/lib/kubelet/pods/d140408d-b7e5-4ba0-9404-877498cf18a1/volumes" Nov 24 11:29:01 crc kubenswrapper[4752]: I1124 11:29:01.078222 4752 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7d558bea-9793-4831-9304-f8cee2b2331e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d558bea-9793-4831-9304-f8cee2b2331e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d558bea_9793_4831_9304_f8cee2b2331e.slice" Nov 24 11:29:01 crc kubenswrapper[4752]: E1124 11:29:01.078273 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
delete cgroup paths for [kubepods besteffort pod7d558bea-9793-4831-9304-f8cee2b2331e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d558bea-9793-4831-9304-f8cee2b2331e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d558bea_9793_4831_9304_f8cee2b2331e.slice" pod="openstack/ovsdbserver-sb-0" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" Nov 24 11:29:01 crc kubenswrapper[4752]: I1124 11:29:01.083509 4752 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod01bd298d-28f4-4c50-80b3-f81ed96de794"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod01bd298d-28f4-4c50-80b3-f81ed96de794] : Timed out while waiting for systemd to remove kubepods-besteffort-pod01bd298d_28f4_4c50_80b3_f81ed96de794.slice" Nov 24 11:29:01 crc kubenswrapper[4752]: I1124 11:29:01.470511 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 11:29:01 crc kubenswrapper[4752]: I1124 11:29:01.508814 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:29:01 crc kubenswrapper[4752]: I1124 11:29:01.514363 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 11:29:02 crc kubenswrapper[4752]: I1124 11:29:02.742205 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d558bea-9793-4831-9304-f8cee2b2331e" path="/var/lib/kubelet/pods/7d558bea-9793-4831-9304-f8cee2b2331e/volumes" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.154619 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.340855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9kpx\" (UniqueName: \"kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx\") pod \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.340983 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts\") pod \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\" (UID: \"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0\") " Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.341788 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.351197 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx" (OuterVolumeSpecName: "kube-api-access-w9kpx") pod "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" (UID: "4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0"). InnerVolumeSpecName "kube-api-access-w9kpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.442600 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9kpx\" (UniqueName: \"kubernetes.io/projected/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-kube-api-access-w9kpx\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.442652 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:04 crc kubenswrapper[4752]: E1124 11:29:04.442665 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:29:04 crc kubenswrapper[4752]: E1124 11:29:04.442738 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts podName:273fc6bc-214e-4fca-8939-003f983d6aa1 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:36.442719726 +0000 UTC m=+1382.427540115 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts") pod "novacell080e2-account-delete-8s7w5" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1") : configmap "openstack-scripts" not found Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.499259 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" containerID="0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515" exitCode=137 Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.499303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancef4c5-account-delete-kjkjp" event={"ID":"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0","Type":"ContainerDied","Data":"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515"} Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.499329 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancef4c5-account-delete-kjkjp" event={"ID":"4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0","Type":"ContainerDied","Data":"718a682b19ea651cbbac9169feb75607846abc1ed86f2897ee2f998e2085d1aa"} Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.499346 4752 scope.go:117] "RemoveContainer" containerID="0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.499335 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancef4c5-account-delete-kjkjp" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.527488 4752 scope.go:117] "RemoveContainer" containerID="0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515" Nov 24 11:29:04 crc kubenswrapper[4752]: E1124 11:29:04.528461 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515\": container with ID starting with 0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515 not found: ID does not exist" containerID="0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.528533 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515"} err="failed to get container status \"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515\": rpc error: code = NotFound desc = could not find container \"0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515\": container with ID starting with 0b2d1772a4113b9569ea388eb821fe2bba108595c602c892167bc5ef467db515 not found: ID does not exist" Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.535623 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.544579 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancef4c5-account-delete-kjkjp"] Nov 24 11:29:04 crc kubenswrapper[4752]: E1124 11:29:04.645971 4752 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 24 11:29:04 crc kubenswrapper[4752]: E1124 11:29:04.646097 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts podName:31369e0b-e7eb-44cc-9d7e-21d196d95ad3 nodeName:}" failed. No retries permitted until 2025-11-24 11:29:36.646063603 +0000 UTC m=+1382.630883932 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts") pod "barbicanf581-account-delete-qm6qk" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3") : configmap "openstack-scripts" not found Nov 24 11:29:04 crc kubenswrapper[4752]: I1124 11:29:04.740903 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" path="/var/lib/kubelet/pods/4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0/volumes" Nov 24 11:29:05 crc kubenswrapper[4752]: E1124 11:29:05.143483 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11bd1a8d_cbf9_47c6_a116_538f60634a33.slice/crio-conmon-fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod273fc6bc_214e_4fca_8939_003f983d6aa1.slice/crio-conmon-51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.248991 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.357250 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts\") pod \"11bd1a8d-cbf9-47c6-a116-538f60634a33\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.357339 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzh6w\" (UniqueName: \"kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w\") pod \"11bd1a8d-cbf9-47c6-a116-538f60634a33\" (UID: \"11bd1a8d-cbf9-47c6-a116-538f60634a33\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.358786 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11bd1a8d-cbf9-47c6-a116-538f60634a33" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.361736 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w" (OuterVolumeSpecName: "kube-api-access-hzh6w") pod "11bd1a8d-cbf9-47c6-a116-538f60634a33" (UID: "11bd1a8d-cbf9-47c6-a116-538f60634a33"). InnerVolumeSpecName "kube-api-access-hzh6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.458921 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11bd1a8d-cbf9-47c6-a116-538f60634a33-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.458950 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzh6w\" (UniqueName: \"kubernetes.io/projected/11bd1a8d-cbf9-47c6-a116-538f60634a33-kube-api-access-hzh6w\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.459464 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.462468 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.507516 4752 generic.go:334] "Generic (PLEG): container finished" podID="273fc6bc-214e-4fca-8939-003f983d6aa1" containerID="51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1" exitCode=137 Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.507761 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell080e2-account-delete-8s7w5" event={"ID":"273fc6bc-214e-4fca-8939-003f983d6aa1","Type":"ContainerDied","Data":"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.507837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell080e2-account-delete-8s7w5" event={"ID":"273fc6bc-214e-4fca-8939-003f983d6aa1","Type":"ContainerDied","Data":"abd2e1662ed7dcf11502498345efcbb4ba177e3072b9adfe74c59d85b7c85869"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.507911 4752 scope.go:117] "RemoveContainer" containerID="51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.507983 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell080e2-account-delete-8s7w5" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.509311 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" containerID="0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88" exitCode=137 Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.509383 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron574b-account-delete-nw8qc" event={"ID":"a8b9f59e-8570-4b2a-9c7b-14be641f74fe","Type":"ContainerDied","Data":"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.509412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron574b-account-delete-nw8qc" event={"ID":"a8b9f59e-8570-4b2a-9c7b-14be641f74fe","Type":"ContainerDied","Data":"658624b76f0dab253674f5903fd66093ed8665d386f9a2195deb9b3c741fc580"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.509481 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron574b-account-delete-nw8qc" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.512034 4752 generic.go:334] "Generic (PLEG): container finished" podID="11bd1a8d-cbf9-47c6-a116-538f60634a33" containerID="fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527" exitCode=137 Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.512085 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapib683-account-delete-k7898" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.512092 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib683-account-delete-k7898" event={"ID":"11bd1a8d-cbf9-47c6-a116-538f60634a33","Type":"ContainerDied","Data":"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.512196 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib683-account-delete-k7898" event={"ID":"11bd1a8d-cbf9-47c6-a116-538f60634a33","Type":"ContainerDied","Data":"000d10841c1abbd59399298c9ba35003072190b03182e79f0d52675198c8d93c"} Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.540703 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.541327 4752 scope.go:117] "RemoveContainer" containerID="51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1" Nov 24 11:29:05 crc kubenswrapper[4752]: E1124 11:29:05.541798 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1\": container with ID starting with 51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1 not found: ID does not exist" containerID="51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.541898 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1"} err="failed to get container status \"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1\": rpc error: code = NotFound desc = could not find container \"51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1\": container with ID starting with 51562528f3ca126cb6a542c5f8fc1ce5583a429c0d7b6e27713bdb2a51828aa1 not found: ID does not exist" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.542027 4752 scope.go:117] "RemoveContainer" containerID="0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.546368 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapib683-account-delete-k7898"] Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.557345 4752 scope.go:117] "RemoveContainer" containerID="0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88" Nov 24 11:29:05 crc kubenswrapper[4752]: E1124 11:29:05.557665 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88\": container with ID starting with 0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88 not found: ID does not exist" containerID="0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.557697 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88"} err="failed to get container status \"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88\": rpc error: code = NotFound desc = could not find container 
\"0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88\": container with ID starting with 0c28586032f85dbb5be5c9dc7486df65e43615fce2f45a9b8350fd4f996c1d88 not found: ID does not exist" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.557721 4752 scope.go:117] "RemoveContainer" containerID="fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.560055 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjq64\" (UniqueName: \"kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64\") pod \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.560102 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts\") pod \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\" (UID: \"a8b9f59e-8570-4b2a-9c7b-14be641f74fe\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.560813 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8b9f59e-8570-4b2a-9c7b-14be641f74fe" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.563942 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64" (OuterVolumeSpecName: "kube-api-access-jjq64") pod "a8b9f59e-8570-4b2a-9c7b-14be641f74fe" (UID: "a8b9f59e-8570-4b2a-9c7b-14be641f74fe"). InnerVolumeSpecName "kube-api-access-jjq64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.571916 4752 scope.go:117] "RemoveContainer" containerID="fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527" Nov 24 11:29:05 crc kubenswrapper[4752]: E1124 11:29:05.573079 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527\": container with ID starting with fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527 not found: ID does not exist" containerID="fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.573135 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527"} err="failed to get container status \"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527\": rpc error: code = NotFound desc = could not find container \"fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527\": container with ID starting with fa5ed377c2b9e970a9675ea141a6258eecf2cb33ef47859900cb051909237527 not found: ID does not exist" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.661716 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2zkj\" (UniqueName: \"kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj\") pod \"273fc6bc-214e-4fca-8939-003f983d6aa1\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.662202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts\") pod \"273fc6bc-214e-4fca-8939-003f983d6aa1\" (UID: \"273fc6bc-214e-4fca-8939-003f983d6aa1\") " Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.662657 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "273fc6bc-214e-4fca-8939-003f983d6aa1" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.663201 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjq64\" (UniqueName: \"kubernetes.io/projected/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-kube-api-access-jjq64\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.663234 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b9f59e-8570-4b2a-9c7b-14be641f74fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.665587 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj" (OuterVolumeSpecName: "kube-api-access-b2zkj") pod "273fc6bc-214e-4fca-8939-003f983d6aa1" (UID: "273fc6bc-214e-4fca-8939-003f983d6aa1"). InnerVolumeSpecName "kube-api-access-b2zkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.764868 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/273fc6bc-214e-4fca-8939-003f983d6aa1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.764911 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2zkj\" (UniqueName: \"kubernetes.io/projected/273fc6bc-214e-4fca-8939-003f983d6aa1-kube-api-access-b2zkj\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.841333 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"] Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.846432 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell080e2-account-delete-8s7w5"] Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.859938 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:29:05 crc kubenswrapper[4752]: I1124 11:29:05.866440 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron574b-account-delete-nw8qc"] Nov 24 11:29:06 crc kubenswrapper[4752]: I1124 11:29:06.754005 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11bd1a8d-cbf9-47c6-a116-538f60634a33" path="/var/lib/kubelet/pods/11bd1a8d-cbf9-47c6-a116-538f60634a33/volumes" Nov 24 11:29:06 crc kubenswrapper[4752]: I1124 11:29:06.754988 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273fc6bc-214e-4fca-8939-003f983d6aa1" path="/var/lib/kubelet/pods/273fc6bc-214e-4fca-8939-003f983d6aa1/volumes" Nov 24 11:29:06 crc kubenswrapper[4752]: I1124 11:29:06.756076 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" path="/var/lib/kubelet/pods/a8b9f59e-8570-4b2a-9c7b-14be641f74fe/volumes" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.321144 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.519165 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g95m5\" (UniqueName: \"kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5\") pod \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.519410 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts\") pod \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\" (UID: \"31369e0b-e7eb-44cc-9d7e-21d196d95ad3\") " Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.520450 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31369e0b-e7eb-44cc-9d7e-21d196d95ad3" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.524272 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5" (OuterVolumeSpecName: "kube-api-access-g95m5") pod "31369e0b-e7eb-44cc-9d7e-21d196d95ad3" (UID: "31369e0b-e7eb-44cc-9d7e-21d196d95ad3"). InnerVolumeSpecName "kube-api-access-g95m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.540203 4752 generic.go:334] "Generic (PLEG): container finished" podID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" containerID="80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef" exitCode=137 Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.540257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanf581-account-delete-qm6qk" event={"ID":"31369e0b-e7eb-44cc-9d7e-21d196d95ad3","Type":"ContainerDied","Data":"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef"} Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.540277 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanf581-account-delete-qm6qk" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.540302 4752 scope.go:117] "RemoveContainer" containerID="80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.540289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanf581-account-delete-qm6qk" event={"ID":"31369e0b-e7eb-44cc-9d7e-21d196d95ad3","Type":"ContainerDied","Data":"9d7d65b03f15c7ef19a705756da017793b0f64eecc214cb8ef05b8218160aea7"} Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.580342 4752 scope.go:117] "RemoveContainer" containerID="80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef" Nov 24 11:29:07 crc kubenswrapper[4752]: E1124 11:29:07.581220 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef\": container with ID starting with 80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef not found: ID does not exist" containerID="80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.581255 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef"} err="failed to get container status \"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef\": rpc error: code = NotFound desc = could not find container \"80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef\": container with ID starting with 80077efb113a9fef0b2ba87a7d085c77587815a0d82bd1d80505c5eea98837ef not found: ID does not exist" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.583079 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"] Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.588799 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicanf581-account-delete-qm6qk"] Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.620328 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:07 crc kubenswrapper[4752]: I1124 11:29:07.620363 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g95m5\" (UniqueName: \"kubernetes.io/projected/31369e0b-e7eb-44cc-9d7e-21d196d95ad3-kube-api-access-g95m5\") on node \"crc\" DevicePath \"\"" Nov 24 11:29:08 crc kubenswrapper[4752]: I1124 11:29:08.740872 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" path="/var/lib/kubelet/pods/31369e0b-e7eb-44cc-9d7e-21d196d95ad3/volumes" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.945640 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"] Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946522 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946535 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946548 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946555 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946562 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="swift-recon-cron" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946568 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="swift-recon-cron" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946577 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-reaper" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946582 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-reaper" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946593 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946599 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946607 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-server" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946612 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-server" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946620 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-central-agent" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946626 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-central-agent" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946638 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server-init" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946643 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server-init" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946650 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946655 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946664 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-server" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946669 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-server" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946677 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb3afd6-0557-4a85-9763-1572d92e6aa3" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946685 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb3afd6-0557-4a85-9763-1572d92e6aa3" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946695 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946702 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946716 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bd1a8d-cbf9-47c6-a116-538f60634a33" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946723 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bd1a8d-cbf9-47c6-a116-538f60634a33" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946732 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="proxy-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946758 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="proxy-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946768 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51186e60-74ec-447f-afd0-14056f01d5a3" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946775 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="51186e60-74ec-447f-afd0-14056f01d5a3" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946785 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 
11:29:23.946790 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946800 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946806 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946814 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946819 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946828 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946834 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946841 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946848 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946856 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946862 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946870 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" containerName="memcached" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946875 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" containerName="memcached" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946885 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="cinder-scheduler" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946891 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="cinder-scheduler" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946901 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-notification-agent" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946906 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-notification-agent" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946919 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-log" Nov 24 11:29:23 crc 
kubenswrapper[4752]: I1124 11:29:23.946925 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946933 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946939 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946947 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946953 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946966 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946971 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946980 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.946987 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.946996 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-updater" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947001 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-updater" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947011 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947017 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947025 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9d21e3-863e-4129-a794-41c3bb8899df" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947031 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9d21e3-863e-4129-a794-41c3bb8899df" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947040 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="rabbitmq" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947045 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="rabbitmq" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947053 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" containerName="mariadb-account-delete" Nov 24 11:29:23 crc 
kubenswrapper[4752]: I1124 11:29:23.947058 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947067 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="openstack-network-exporter" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947073 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="openstack-network-exporter" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947082 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="setup-container" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947089 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="setup-container" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947098 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947105 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947115 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947123 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-httpd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947134 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947140 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947149 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273fc6bc-214e-4fca-8939-003f983d6aa1" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947154 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="273fc6bc-214e-4fca-8939-003f983d6aa1" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947161 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947168 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947177 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947184 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947195 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="rsync" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947202 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="rsync" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947210 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="sg-core" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947216 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="sg-core" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947225 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947231 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947236 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="probe" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947242 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="probe" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947250 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="galera" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947255 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="galera" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947261 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="mysql-bootstrap" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947267 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="mysql-bootstrap" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947275 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-server" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947280 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-server" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947291 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" containerName="kube-state-metrics" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947296 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" containerName="kube-state-metrics" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947305 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947311 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-replicator" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947320 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947326 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-auditor" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947337 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerName="nova-cell1-conductor-conductor" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947342 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerName="nova-cell1-conductor-conductor" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947352 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947357 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" containerName="mariadb-account-delete" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947365 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-expirer" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947370 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-expirer" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947380 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947387 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947395 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-updater" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947401 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-updater" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947413 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd9d1a6-a562-443e-b16d-76c159107794" containerName="keystone-api" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947419 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd9d1a6-a562-443e-b16d-76c159107794" containerName="keystone-api" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947429 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947435 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947444 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="setup-container" Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947450 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="setup-container" Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947459 4752 
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947465 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="rabbitmq"
Nov 24 11:29:23 crc kubenswrapper[4752]: E1124 11:29:23.947474 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947480 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947591 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947599 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d820ba8-03d6-48f4-9423-bbc1ed64a36b" containerName="rabbitmq"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947607 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="ovn-northd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947617 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947626 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-replicator"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947635 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd9d1a6-a562-443e-b16d-76c159107794" containerName="keystone-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947644 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-reaper"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947651 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-updater"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947659 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="probe"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947668 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-auditor"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947675 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-server"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947683 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f1e943-afb7-40c3-9ff2-56791f4e0ad5" containerName="rabbitmq"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947690 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be4efe6-ed60-4417-a84c-8ff27bf4a685" containerName="nova-cell1-conductor-conductor"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947700 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-replicator"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947708 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="31369e0b-e7eb-44cc-9d7e-21d196d95ad3" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947719 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-server"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947726 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovsdb-server"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947731 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947738 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="rsync"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947763 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-notification-agent"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947772 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947778 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c592991-2ffb-417a-aa80-49d0111618bb" containerName="barbican-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947787 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e4558a-350e-442a-ad61-70d9a9824219" containerName="placement-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947795 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0084f6-064e-4089-87fe-4ead63923b56" containerName="cinder-scheduler"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947803 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947811 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947818 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="26da9a73-3ede-476b-bfab-08839a8b88d1" containerName="ovs-vswitchd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947825 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="11bd1a8d-cbf9-47c6-a116-538f60634a33" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947832 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="273fc6bc-214e-4fca-8939-003f983d6aa1" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947841 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-replicator"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947847 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-auditor"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947853 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947861 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="swift-recon-cron"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947868 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-updater"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947875 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd3886c-7122-4bd5-97d2-5c448c31e941" containerName="cinder-api"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947883 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b08ab3-f6c3-417e-abce-e03994e7553d" containerName="nova-api-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947892 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-metadata"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947900 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="ceilometer-central-agent"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947908 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="sg-core"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947915 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ac218-b256-4d82-afb6-34448254aa9f" containerName="proxy-httpd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947923 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9495af0b-dca8-4695-8f40-f4f9a7ec5229" containerName="ovn-controller"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947931 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdb834d-3fdb-4bed-b349-e41c60ec4d64" containerName="nova-metadata-log"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947938 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb825f2-a05f-409d-b4cc-80408e2db5d7" containerName="kube-state-metrics"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947947 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fc61a1-6794-4ecf-a3dd-d79a88eda486" containerName="glance-httpd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947958 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c01e5f-8338-497f-9e47-e2e3e857df63" containerName="glance-httpd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947967 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd071fb7-f9a2-4f4a-aad6-90c340f0d009" containerName="nova-scheduler-scheduler"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947975 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="object-expirer"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947988 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1020727e-3725-4fed-b276-043ae4d30c4c" containerName="memcached"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.947999 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9afd9c8-7b23-44fb-99ce-7924833d4589" containerName="neutron-httpd"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948010 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="account-server"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948020 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d140408d-b7e5-4ba0-9404-877498cf18a1" containerName="container-auditor"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948028 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="30292578-890d-42e8-b689-b909de083b48" containerName="galera"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948034 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb3afd6-0557-4a85-9763-1572d92e6aa3" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948043 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9336cf-11ee-46eb-a1a1-fece1d6a1ef0" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948054 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b9f59e-8570-4b2a-9c7b-14be641f74fe" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948060 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1077002f-be90-4e09-8158-efdc98329e5b" containerName="openstack-network-exporter"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948069 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9d21e3-863e-4129-a794-41c3bb8899df" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948075 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="51186e60-74ec-447f-afd0-14056f01d5a3" containerName="mariadb-account-delete"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.948976 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.981458 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq88f\" (UniqueName: \"kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.981918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:23 crc kubenswrapper[4752]: I1124 11:29:23.982054 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.011981 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"]
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.082781 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.083006 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq88f\" (UniqueName: \"kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.083131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.083547 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.083836 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.108652 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq88f\" (UniqueName: \"kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f\") pod \"redhat-operators-n4z5t\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") " pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.268726 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:24 crc kubenswrapper[4752]: I1124 11:29:24.754458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"]
Nov 24 11:29:25 crc kubenswrapper[4752]: I1124 11:29:25.727059 4752 generic.go:334] "Generic (PLEG): container finished" podID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerID="753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319" exitCode=0
Nov 24 11:29:25 crc kubenswrapper[4752]: I1124 11:29:25.727457 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerDied","Data":"753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319"}
Nov 24 11:29:25 crc kubenswrapper[4752]: I1124 11:29:25.727542 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerStarted","Data":"9da0720c5ba0debf55bf1760be2392e2d52d11e82a4caa85de75748fb5350505"}
Nov 24 11:29:25 crc kubenswrapper[4752]: I1124 11:29:25.729816 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 24 11:29:26 crc kubenswrapper[4752]: I1124 11:29:26.738107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerStarted","Data":"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"}
Nov 24 11:29:27 crc kubenswrapper[4752]: I1124 11:29:27.747731 4752 generic.go:334] "Generic (PLEG): container finished" podID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerID="66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81" exitCode=0
Nov 24 11:29:27 crc kubenswrapper[4752]: I1124 11:29:27.747802 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerDied","Data":"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"}
Nov 24 11:29:28 crc kubenswrapper[4752]: I1124 11:29:28.759335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerStarted","Data":"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"}
Nov 24 11:29:28 crc kubenswrapper[4752]: I1124 11:29:28.782085 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4z5t" podStartSLOduration=3.382068794 podStartE2EDuration="5.782067545s" podCreationTimestamp="2025-11-24 11:29:23 +0000 UTC" firstStartedPulling="2025-11-24 11:29:25.728676168 +0000 UTC m=+1371.713496457" lastFinishedPulling="2025-11-24 11:29:28.128674919 +0000 UTC m=+1374.113495208" observedRunningTime="2025-11-24 11:29:28.776823173 +0000 UTC m=+1374.761643462" watchObservedRunningTime="2025-11-24 11:29:28.782067545 +0000 UTC m=+1374.766887834"
Nov 24 11:29:34 crc kubenswrapper[4752]: I1124 11:29:34.269246 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:34 crc kubenswrapper[4752]: I1124 11:29:34.269690 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:34 crc kubenswrapper[4752]: I1124 11:29:34.326224 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:34 crc kubenswrapper[4752]: I1124 11:29:34.865147 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:34 crc kubenswrapper[4752]: I1124 11:29:34.912887 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"]
Nov 24 11:29:36 crc kubenswrapper[4752]: I1124 11:29:36.824536 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4z5t" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="registry-server" containerID="cri-o://bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5" gracePeriod=2
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.240378 4752 scope.go:117] "RemoveContainer" containerID="d34722d3f40c1996b7a80559039e9a50e8496db2ea2f4dc7d64da0f375ada589"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.339547 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.505292 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities\") pod \"4683b220-90e6-40d0-b220-b88653c4c2a4\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") "
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.505389 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content\") pod \"4683b220-90e6-40d0-b220-b88653c4c2a4\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") "
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.505512 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq88f\" (UniqueName: \"kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f\") pod \"4683b220-90e6-40d0-b220-b88653c4c2a4\" (UID: \"4683b220-90e6-40d0-b220-b88653c4c2a4\") "
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.507163 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities" (OuterVolumeSpecName: "utilities") pod "4683b220-90e6-40d0-b220-b88653c4c2a4" (UID: "4683b220-90e6-40d0-b220-b88653c4c2a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.510992 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f" (OuterVolumeSpecName: "kube-api-access-dq88f") pod "4683b220-90e6-40d0-b220-b88653c4c2a4" (UID: "4683b220-90e6-40d0-b220-b88653c4c2a4"). InnerVolumeSpecName "kube-api-access-dq88f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.598789 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4683b220-90e6-40d0-b220-b88653c4c2a4" (UID: "4683b220-90e6-40d0-b220-b88653c4c2a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.606724 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq88f\" (UniqueName: \"kubernetes.io/projected/4683b220-90e6-40d0-b220-b88653c4c2a4-kube-api-access-dq88f\") on node \"crc\" DevicePath \"\""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.606790 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.606805 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4683b220-90e6-40d0-b220-b88653c4c2a4-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.851171 4752 generic.go:334] "Generic (PLEG): container finished" podID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerID="bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5" exitCode=0
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.851210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerDied","Data":"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"}
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.851233 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z5t" event={"ID":"4683b220-90e6-40d0-b220-b88653c4c2a4","Type":"ContainerDied","Data":"9da0720c5ba0debf55bf1760be2392e2d52d11e82a4caa85de75748fb5350505"}
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.851252 4752 scope.go:117] "RemoveContainer" containerID="bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.851260 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z5t"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.876452 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"]
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.882977 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4z5t"]
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.883118 4752 scope.go:117] "RemoveContainer" containerID="66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.899925 4752 scope.go:117] "RemoveContainer" containerID="753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.929136 4752 scope.go:117] "RemoveContainer" containerID="bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"
Nov 24 11:29:38 crc kubenswrapper[4752]: E1124 11:29:38.929973 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5\": container with ID starting with bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5 not found: ID does not exist" containerID="bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.930008 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5"} err="failed to get container status \"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5\": rpc error: code = NotFound desc = could not find container \"bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5\": container with ID starting with bca877ab48c10f53244871e503b8462fc3523e9fd8fc656b3f1bb9e1c2fb4cc5 not found: ID does not exist"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.930029 4752 scope.go:117] "RemoveContainer" containerID="66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"
Nov 24 11:29:38 crc kubenswrapper[4752]: E1124 11:29:38.931121 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81\": container with ID starting with 66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81 not found: ID does not exist" containerID="66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.931171 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81"} err="failed to get container status \"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81\": rpc error: code = NotFound desc = could not find container \"66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81\": container with ID starting with 66fdae3d257cd1885b0341403c15e9c6f6be7346418b9ce62dfae4cc63ee2c81 not found: ID does not exist"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.931200 4752 scope.go:117] "RemoveContainer" containerID="753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319"
Nov 24 11:29:38 crc kubenswrapper[4752]: E1124 11:29:38.935399 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319\": container with ID starting with 753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319 not found: ID does not exist" containerID="753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319"
Nov 24 11:29:38 crc kubenswrapper[4752]: I1124 11:29:38.935445 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319"} err="failed to get container status \"753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319\": rpc error: code = NotFound desc = could not find container \"753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319\": container with ID starting with 753d77b7435600f57b0beb5c556d494c4d0e3483edd0fbf22002775440a61319 not found: ID does not exist"
Nov 24 11:29:40 crc kubenswrapper[4752]: I1124 11:29:40.738559 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" path="/var/lib/kubelet/pods/4683b220-90e6-40d0-b220-b88653c4c2a4/volumes"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.159620 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"]
Nov 24 11:30:00 crc kubenswrapper[4752]: E1124 11:30:00.160443 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="extract-content"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.160456 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="extract-content"
Nov 24 11:30:00 crc kubenswrapper[4752]: E1124 11:30:00.160474 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="extract-utilities"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.160480 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="extract-utilities"
Nov 24 11:30:00 crc kubenswrapper[4752]: E1124 11:30:00.160504 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="registry-server"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.160509 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="registry-server"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.160670 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4683b220-90e6-40d0-b220-b88653c4c2a4" containerName="registry-server"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.161223 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.163203 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.163447 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.166578 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"]
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.231389 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.231724 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.231912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkkmp\" (UniqueName: \"kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.333330 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkkmp\" (UniqueName: \"kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.333839 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.334124 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.335547 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.341800 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.354903 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkkmp\" (UniqueName: \"kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp\") pod \"collect-profiles-29399730-wlws9\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.515653 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:00 crc kubenswrapper[4752]: I1124 11:30:00.930672 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"]
Nov 24 11:30:01 crc kubenswrapper[4752]: I1124 11:30:01.044128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9" event={"ID":"4f68b128-215a-4b08-b1ce-ef179f020723","Type":"ContainerStarted","Data":"2880b26652fbe697034485c54ed84196548f501b50b0450e4055b04dbf1e5a0f"}
Nov 24 11:30:02 crc kubenswrapper[4752]: I1124 11:30:02.055251 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f68b128-215a-4b08-b1ce-ef179f020723" containerID="9e6466701c5e2974943f058f4727ef5cde02f243f235bbd683f20ee6b05000b7" exitCode=0
Nov 24 11:30:02 crc kubenswrapper[4752]: I1124 11:30:02.055336 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9" event={"ID":"4f68b128-215a-4b08-b1ce-ef179f020723","Type":"ContainerDied","Data":"9e6466701c5e2974943f058f4727ef5cde02f243f235bbd683f20ee6b05000b7"}
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.352062 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.479860 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume\") pod \"4f68b128-215a-4b08-b1ce-ef179f020723\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") "
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.480061 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkkmp\" (UniqueName: \"kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp\") pod \"4f68b128-215a-4b08-b1ce-ef179f020723\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") "
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.480168 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume\") pod \"4f68b128-215a-4b08-b1ce-ef179f020723\" (UID: \"4f68b128-215a-4b08-b1ce-ef179f020723\") "
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.481207 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f68b128-215a-4b08-b1ce-ef179f020723" (UID: "4f68b128-215a-4b08-b1ce-ef179f020723"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.485946 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f68b128-215a-4b08-b1ce-ef179f020723" (UID: "4f68b128-215a-4b08-b1ce-ef179f020723"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.486861 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp" (OuterVolumeSpecName: "kube-api-access-qkkmp") pod "4f68b128-215a-4b08-b1ce-ef179f020723" (UID: "4f68b128-215a-4b08-b1ce-ef179f020723"). InnerVolumeSpecName "kube-api-access-qkkmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.582714 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f68b128-215a-4b08-b1ce-ef179f020723-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.582786 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkkmp\" (UniqueName: \"kubernetes.io/projected/4f68b128-215a-4b08-b1ce-ef179f020723-kube-api-access-qkkmp\") on node \"crc\" DevicePath \"\""
Nov 24 11:30:03 crc kubenswrapper[4752]: I1124 11:30:03.582800 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f68b128-215a-4b08-b1ce-ef179f020723-config-volume\") on node \"crc\" DevicePath \"\""
Nov 24 11:30:04 crc kubenswrapper[4752]: I1124 11:30:04.073495 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9" event={"ID":"4f68b128-215a-4b08-b1ce-ef179f020723","Type":"ContainerDied","Data":"2880b26652fbe697034485c54ed84196548f501b50b0450e4055b04dbf1e5a0f"}
Nov 24 11:30:04 crc kubenswrapper[4752]: I1124 11:30:04.073565 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2880b26652fbe697034485c54ed84196548f501b50b0450e4055b04dbf1e5a0f"
Nov 24 11:30:04 crc kubenswrapper[4752]: I1124 11:30:04.073619 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"
Nov 24 11:30:15 crc kubenswrapper[4752]: I1124 11:30:15.469167 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 11:30:15 crc kubenswrapper[4752]: I1124 11:30:15.469687 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.691174 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"]
Nov 24 11:30:31 crc kubenswrapper[4752]: E1124 11:30:31.692267 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f68b128-215a-4b08-b1ce-ef179f020723" containerName="collect-profiles"
Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.692283 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f68b128-215a-4b08-b1ce-ef179f020723" containerName="collect-profiles"
Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.692567 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f68b128-215a-4b08-b1ce-ef179f020723" containerName="collect-profiles"
Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.693977 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5r6c"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.700447 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"] Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.819320 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txd2w\" (UniqueName: \"kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.819369 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.819798 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.925532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txd2w\" (UniqueName: \"kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.925582 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.925665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.926190 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.926384 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:31 crc kubenswrapper[4752]: I1124 11:30:31.947463 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-txd2w\" (UniqueName: \"kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w\") pod \"redhat-marketplace-r5r6c\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:32 crc kubenswrapper[4752]: I1124 11:30:32.026533 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:32 crc kubenswrapper[4752]: I1124 11:30:32.257587 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"] Nov 24 11:30:32 crc kubenswrapper[4752]: I1124 11:30:32.348631 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerStarted","Data":"25064bd6d53466d984f8ef8498cd54512ad30446b9402add9bb403755307c782"} Nov 24 11:30:33 crc kubenswrapper[4752]: I1124 11:30:33.367651 4752 generic.go:334] "Generic (PLEG): container finished" podID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerID="85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a" exitCode=0 Nov 24 11:30:33 crc kubenswrapper[4752]: I1124 11:30:33.367716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerDied","Data":"85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a"} Nov 24 11:30:35 crc kubenswrapper[4752]: I1124 11:30:35.385609 4752 generic.go:334] "Generic (PLEG): container finished" podID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerID="516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe" exitCode=0 Nov 24 11:30:35 crc kubenswrapper[4752]: I1124 11:30:35.385712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerDied","Data":"516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe"} Nov 24 11:30:37 crc kubenswrapper[4752]: I1124 11:30:37.408106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerStarted","Data":"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8"} Nov 24 11:30:37 crc kubenswrapper[4752]: I1124 11:30:37.429240 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5r6c" podStartSLOduration=2.945385845 podStartE2EDuration="6.429219827s" podCreationTimestamp="2025-11-24 11:30:31 +0000 UTC" firstStartedPulling="2025-11-24 11:30:33.370020341 +0000 UTC m=+1439.354840630" lastFinishedPulling="2025-11-24 11:30:36.853854293 +0000 UTC m=+1442.838674612" observedRunningTime="2025-11-24 11:30:37.427561249 +0000 UTC m=+1443.412381568" watchObservedRunningTime="2025-11-24 11:30:37.429219827 +0000 UTC m=+1443.414040116" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.428705 4752 scope.go:117] "RemoveContainer" containerID="89f677f71ca7f0defc38da546867b6b1ffd8d6a90e5b7096f3f34fdcbd93a5d2" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.457035 4752 scope.go:117] "RemoveContainer" containerID="04e513bef10e5502ac575ac143cf0082eb87c1fcae98ef5badd9c267c63d7fa4" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.481031 4752 scope.go:117] "RemoveContainer" 
containerID="e14543262653c738d489c1dbda8ff1a881ebfc697586d0e9304a135ff3b5f748" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.513158 4752 scope.go:117] "RemoveContainer" containerID="4407b9469063ff93a22f5f0b8c37efedcfbeb77bcd5803ed59f64e869161de89" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.552365 4752 scope.go:117] "RemoveContainer" containerID="954b47a267aaf50acc4f84e04c8beafb487e7e1d28523852ecbb0dc2c03e0221" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.587462 4752 scope.go:117] "RemoveContainer" containerID="635c128b61e0f1be298eda06897c755b3bed52e73a8086297eaf4ffc3f88e948" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.613029 4752 scope.go:117] "RemoveContainer" containerID="bac5dbc1bd66465003c32bb0bf945a6afc962ebd56594b233ce83b4b293019e0" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.646095 4752 scope.go:117] "RemoveContainer" containerID="b77b1b1e377ddca3c012d7c1f6ddebf83b2c8548e70877aa230822808f642838" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.666188 4752 scope.go:117] "RemoveContainer" containerID="94a1d35075cddbba48a56524dbb8d50a49c8101d32685bfc1f3920e8a633ef21" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.685934 4752 scope.go:117] "RemoveContainer" containerID="8a24630c9c439bcce109e7be607830a0db7b63fc24cf6c38472435d5053bd622" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.709824 4752 scope.go:117] "RemoveContainer" containerID="00c9561e9a5735f6ed6ea2bd3f779c56f52d92d1f868c56150ff9276fc50cb92" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.733965 4752 scope.go:117] "RemoveContainer" containerID="edf2a6cd801741600904ad2ea8216f876114e28709f11d16ba51608f5807b542" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.757548 4752 scope.go:117] "RemoveContainer" containerID="c9c0faa7fabab5aa35b498dc4929bf133db55cdfd0aaa0be2d13f77832bd89d9" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.784020 4752 scope.go:117] "RemoveContainer" containerID="ba0485f587e84e6729b1a0edb628c4d684a2d0274e0deb773446ecb4a744c7d3" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.811032 4752 scope.go:117] "RemoveContainer" containerID="f4135c4e3c9981db1bcebbf4dd1fe967223301c429be2fa843360da3aaec3e0b" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.847461 4752 scope.go:117] "RemoveContainer" containerID="cd577f878c121c20c426390b80b9699b5ef8403e3a67c8fb2140b6b0c76f6a3f" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.888492 4752 scope.go:117] "RemoveContainer" containerID="5c6f110ff0032fe3eb25f36701cc1f335c97745e7f001c2c26b1d02c30d0b5ce" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.972346 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.974294 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:38 crc kubenswrapper[4752]: I1124 11:30:38.988093 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.032834 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jmsf\" (UniqueName: \"kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.032936 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.032974 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.133926 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jmsf\" (UniqueName: \"kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.134266 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.134291 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.134831 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.135012 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.153994 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5jmsf\" (UniqueName: \"kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf\") pod \"community-operators-k6sst\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.303124 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:39 crc kubenswrapper[4752]: I1124 11:30:39.765144 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:39 crc kubenswrapper[4752]: W1124 11:30:39.771506 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1472ca2c_bc4e_4045_9300_69f716c5f3ce.slice/crio-0cfa4f1d45299af0168ff4decdb6109707b6e171f01bae3a1a0a66891097e5a3 WatchSource:0}: Error finding container 0cfa4f1d45299af0168ff4decdb6109707b6e171f01bae3a1a0a66891097e5a3: Status 404 returned error can't find the container with id 0cfa4f1d45299af0168ff4decdb6109707b6e171f01bae3a1a0a66891097e5a3 Nov 24 11:30:40 crc kubenswrapper[4752]: I1124 11:30:40.447107 4752 generic.go:334] "Generic (PLEG): container finished" podID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerID="0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98" exitCode=0 Nov 24 11:30:40 crc kubenswrapper[4752]: I1124 11:30:40.447200 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerDied","Data":"0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98"} Nov 24 11:30:40 crc kubenswrapper[4752]: I1124 11:30:40.447246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerStarted","Data":"0cfa4f1d45299af0168ff4decdb6109707b6e171f01bae3a1a0a66891097e5a3"} Nov 24 11:30:41 crc kubenswrapper[4752]: I1124 11:30:41.468975 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerStarted","Data":"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4"} Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.028038 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.028089 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.073518 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.479783 4752 generic.go:334] "Generic (PLEG): container finished" podID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerID="e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4" exitCode=0 Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.479829 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" 
event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerDied","Data":"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4"} Nov 24 11:30:42 crc kubenswrapper[4752]: I1124 11:30:42.524379 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:43 crc kubenswrapper[4752]: I1124 11:30:43.490325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerStarted","Data":"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06"} Nov 24 11:30:43 crc kubenswrapper[4752]: I1124 11:30:43.515638 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k6sst" podStartSLOduration=3.092377114 podStartE2EDuration="5.515609642s" podCreationTimestamp="2025-11-24 11:30:38 +0000 UTC" firstStartedPulling="2025-11-24 11:30:40.451369741 +0000 UTC m=+1446.436190050" lastFinishedPulling="2025-11-24 11:30:42.874602289 +0000 UTC m=+1448.859422578" observedRunningTime="2025-11-24 11:30:43.510545206 +0000 UTC m=+1449.495365485" watchObservedRunningTime="2025-11-24 11:30:43.515609642 +0000 UTC m=+1449.500429931" Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.351081 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"] Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.500013 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5r6c" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="registry-server" containerID="cri-o://170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8" gracePeriod=2 Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.856045 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.916846 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities\") pod \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.916960 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content\") pod \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.917024 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txd2w\" (UniqueName: \"kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w\") pod \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\" (UID: \"122f7a2d-6c9c-49d9-9314-4b28cc33c73a\") " Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.917651 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities" (OuterVolumeSpecName: "utilities") pod "122f7a2d-6c9c-49d9-9314-4b28cc33c73a" (UID: "122f7a2d-6c9c-49d9-9314-4b28cc33c73a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.922592 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w" (OuterVolumeSpecName: "kube-api-access-txd2w") pod "122f7a2d-6c9c-49d9-9314-4b28cc33c73a" (UID: "122f7a2d-6c9c-49d9-9314-4b28cc33c73a"). InnerVolumeSpecName "kube-api-access-txd2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:30:44 crc kubenswrapper[4752]: I1124 11:30:44.936772 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "122f7a2d-6c9c-49d9-9314-4b28cc33c73a" (UID: "122f7a2d-6c9c-49d9-9314-4b28cc33c73a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.019347 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.019404 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txd2w\" (UniqueName: \"kubernetes.io/projected/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-kube-api-access-txd2w\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.019419 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122f7a2d-6c9c-49d9-9314-4b28cc33c73a-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.468982 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.469076 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.513174 4752 generic.go:334] "Generic (PLEG): container finished" podID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerID="170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8" exitCode=0 Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.513246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerDied","Data":"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8"} Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.513291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5r6c" event={"ID":"122f7a2d-6c9c-49d9-9314-4b28cc33c73a","Type":"ContainerDied","Data":"25064bd6d53466d984f8ef8498cd54512ad30446b9402add9bb403755307c782"} Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.513323 4752 scope.go:117] "RemoveContainer" 
containerID="170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.513546 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5r6c" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.548889 4752 scope.go:117] "RemoveContainer" containerID="516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.570315 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"] Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.578294 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5r6c"] Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.602561 4752 scope.go:117] "RemoveContainer" containerID="85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.622791 4752 scope.go:117] "RemoveContainer" containerID="170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8" Nov 24 11:30:45 crc kubenswrapper[4752]: E1124 11:30:45.623581 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8\": container with ID starting with 170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8 not found: ID does not exist" containerID="170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.623616 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8"} err="failed to get container status \"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8\": rpc error: code = NotFound desc = could not find container \"170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8\": container with ID starting with 170d0c728bf15878ecdef26487183b0ed9dc93bbaa3866fec2ad9f9e5eb88fe8 not found: ID does not exist" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.623658 4752 scope.go:117] "RemoveContainer" containerID="516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe" Nov 24 11:30:45 crc kubenswrapper[4752]: E1124 11:30:45.623932 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe\": container with ID starting with 516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe not found: ID does not exist" containerID="516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.623955 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe"} err="failed to get container status \"516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe\": rpc error: code = NotFound desc = could not find container \"516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe\": container with ID starting with 516e1bc3e1b53f9f93bf5ea00dc84f424a8446c7b8cb5b5032f3ae369b158efe not found: ID does not exist" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.624004 4752 scope.go:117] "RemoveContainer" 
containerID="85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a" Nov 24 11:30:45 crc kubenswrapper[4752]: E1124 11:30:45.624371 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a\": container with ID starting with 85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a not found: ID does not exist" containerID="85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a" Nov 24 11:30:45 crc kubenswrapper[4752]: I1124 11:30:45.624421 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a"} err="failed to get container status \"85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a\": rpc error: code = NotFound desc = could not find container \"85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a\": container with ID starting with 85f6fdde8a79df217928a2288efec2dd9ddf7fb250576e6bf6c86e89d78a942a not found: ID does not exist" Nov 24 11:30:46 crc kubenswrapper[4752]: I1124 11:30:46.745824 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" path="/var/lib/kubelet/pods/122f7a2d-6c9c-49d9-9314-4b28cc33c73a/volumes" Nov 24 11:30:49 crc kubenswrapper[4752]: I1124 11:30:49.304145 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:49 crc kubenswrapper[4752]: I1124 11:30:49.304686 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:49 crc kubenswrapper[4752]: I1124 11:30:49.356510 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:49 crc kubenswrapper[4752]: I1124 11:30:49.593013 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:49 crc kubenswrapper[4752]: I1124 11:30:49.656391 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:51 crc kubenswrapper[4752]: I1124 11:30:51.584823 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k6sst" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="registry-server" containerID="cri-o://789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06" gracePeriod=2 Nov 24 11:30:51 crc kubenswrapper[4752]: I1124 11:30:51.963515 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.068958 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities\") pod \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.069121 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content\") pod \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.069193 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jmsf\" (UniqueName: \"kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf\") pod \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\" (UID: \"1472ca2c-bc4e-4045-9300-69f716c5f3ce\") " Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.071325 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities" (OuterVolumeSpecName: "utilities") pod "1472ca2c-bc4e-4045-9300-69f716c5f3ce" (UID: "1472ca2c-bc4e-4045-9300-69f716c5f3ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.077257 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf" (OuterVolumeSpecName: "kube-api-access-5jmsf") pod "1472ca2c-bc4e-4045-9300-69f716c5f3ce" (UID: "1472ca2c-bc4e-4045-9300-69f716c5f3ce"). InnerVolumeSpecName "kube-api-access-5jmsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.171157 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.171695 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jmsf\" (UniqueName: \"kubernetes.io/projected/1472ca2c-bc4e-4045-9300-69f716c5f3ce-kube-api-access-5jmsf\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.306329 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1472ca2c-bc4e-4045-9300-69f716c5f3ce" (UID: "1472ca2c-bc4e-4045-9300-69f716c5f3ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.374773 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1472ca2c-bc4e-4045-9300-69f716c5f3ce-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.596347 4752 generic.go:334] "Generic (PLEG): container finished" podID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerID="789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06" exitCode=0 Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.596412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerDied","Data":"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06"} Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.596447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k6sst" event={"ID":"1472ca2c-bc4e-4045-9300-69f716c5f3ce","Type":"ContainerDied","Data":"0cfa4f1d45299af0168ff4decdb6109707b6e171f01bae3a1a0a66891097e5a3"} Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.596471 4752 scope.go:117] "RemoveContainer" containerID="789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.597452 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k6sst" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.617968 4752 scope.go:117] "RemoveContainer" containerID="e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.630177 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.641107 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k6sst"] Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.653979 4752 scope.go:117] "RemoveContainer" containerID="0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.678242 4752 scope.go:117] "RemoveContainer" containerID="789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06" Nov 24 11:30:52 crc kubenswrapper[4752]: E1124 11:30:52.678632 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06\": container with ID starting with 789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06 not found: ID does not exist" containerID="789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.678685 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06"} err="failed to get container status \"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06\": rpc error: code = NotFound desc = could not find container \"789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06\": container with ID starting with 789774e86b0961bc5a41d8d8908e05e1a6e434f4d8c2b3ec59fbff6a8202ca06 not found: ID does not exist" Nov 24 
11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.678721 4752 scope.go:117] "RemoveContainer" containerID="e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4" Nov 24 11:30:52 crc kubenswrapper[4752]: E1124 11:30:52.679067 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4\": container with ID starting with e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4 not found: ID does not exist" containerID="e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.679101 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4"} err="failed to get container status \"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4\": rpc error: code = NotFound desc = could not find container \"e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4\": container with ID starting with e51351658a2bc31ac8046f8086f568a4243fd91ac47329bb1fe088ef8bfcdfc4 not found: ID does not exist" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.679125 4752 scope.go:117] "RemoveContainer" containerID="0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98" Nov 24 11:30:52 crc kubenswrapper[4752]: E1124 11:30:52.679825 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98\": container with ID starting with 0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98 not found: ID does not exist" containerID="0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.679857 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98"} err="failed to get container status \"0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98\": rpc error: code = NotFound desc = could not find container \"0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98\": container with ID starting with 0577c6648ef71cd7017b49be1eaaede44ee8b6622eb3711acaedb6138b8b1b98 not found: ID does not exist" Nov 24 11:30:52 crc kubenswrapper[4752]: I1124 11:30:52.735647 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" path="/var/lib/kubelet/pods/1472ca2c-bc4e-4045-9300-69f716c5f3ce/volumes" Nov 24 11:30:54 crc kubenswrapper[4752]: I1124 11:30:54.999395 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000098 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000122 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000144 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="extract-utilities" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000157 
4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="extract-utilities" Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000175 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="extract-content" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000185 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="extract-content" Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000203 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="extract-utilities" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000212 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="extract-utilities" Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000240 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000258 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: E1124 11:30:55.000286 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="extract-content" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000297 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="extract-content" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000497 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1472ca2c-bc4e-4045-9300-69f716c5f3ce" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.000541 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="122f7a2d-6c9c-49d9-9314-4b28cc33c73a" containerName="registry-server" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.002165 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.018276 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.112864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.112915 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.112937 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.215144 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.215208 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.215238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.215702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.215824 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.233773 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc\") pod \"certified-operators-mswn2\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.323630 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:30:55 crc kubenswrapper[4752]: I1124 11:30:55.637456 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:30:56 crc kubenswrapper[4752]: I1124 11:30:56.637054 4752 generic.go:334] "Generic (PLEG): container finished" podID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerID="51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e" exitCode=0 Nov 24 11:30:56 crc kubenswrapper[4752]: I1124 11:30:56.637103 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerDied","Data":"51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e"} Nov 24 11:30:56 crc kubenswrapper[4752]: I1124 11:30:56.637350 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerStarted","Data":"c14d25a6f4b42b967626cd6452c8f8241550a9959d081a5f933c48bed155f11a"} Nov 24 11:30:57 crc kubenswrapper[4752]: I1124 11:30:57.646721 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerStarted","Data":"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795"} Nov 24 11:30:58 crc kubenswrapper[4752]: I1124 11:30:58.656514 4752 generic.go:334] "Generic (PLEG): container finished" podID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerID="4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795" exitCode=0 Nov 24 11:30:58 crc kubenswrapper[4752]: I1124 11:30:58.656873 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerDied","Data":"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795"} Nov 24 11:30:59 crc kubenswrapper[4752]: I1124 11:30:59.670821 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerStarted","Data":"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408"} Nov 24 11:30:59 crc kubenswrapper[4752]: I1124 11:30:59.693726 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mswn2" podStartSLOduration=3.227649762 podStartE2EDuration="5.693700539s" podCreationTimestamp="2025-11-24 11:30:54 +0000 UTC" firstStartedPulling="2025-11-24 11:30:56.639266452 +0000 UTC m=+1462.624086741" lastFinishedPulling="2025-11-24 11:30:59.105317219 +0000 UTC m=+1465.090137518" observedRunningTime="2025-11-24 11:30:59.689825397 +0000 UTC m=+1465.674645676" watchObservedRunningTime="2025-11-24 11:30:59.693700539 +0000 UTC m=+1465.678520828" Nov 24 11:31:05 crc kubenswrapper[4752]: I1124 11:31:05.324624 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:05 crc kubenswrapper[4752]: I1124 11:31:05.325104 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:05 crc kubenswrapper[4752]: I1124 11:31:05.382757 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:05 crc kubenswrapper[4752]: I1124 11:31:05.757804 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:05 crc kubenswrapper[4752]: I1124 11:31:05.812732 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:31:07 crc kubenswrapper[4752]: I1124 11:31:07.741152 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mswn2" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="registry-server" containerID="cri-o://4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408" gracePeriod=2 Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.159883 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.324519 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc\") pod \"201aace8-e4aa-43f6-8c57-4a59170e14b0\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.324734 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities\") pod \"201aace8-e4aa-43f6-8c57-4a59170e14b0\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.324865 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content\") pod \"201aace8-e4aa-43f6-8c57-4a59170e14b0\" (UID: \"201aace8-e4aa-43f6-8c57-4a59170e14b0\") " Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.327833 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities" (OuterVolumeSpecName: "utilities") pod "201aace8-e4aa-43f6-8c57-4a59170e14b0" (UID: "201aace8-e4aa-43f6-8c57-4a59170e14b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.331374 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc" (OuterVolumeSpecName: "kube-api-access-dwlcc") pod "201aace8-e4aa-43f6-8c57-4a59170e14b0" (UID: "201aace8-e4aa-43f6-8c57-4a59170e14b0"). InnerVolumeSpecName "kube-api-access-dwlcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.427710 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.427746 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwlcc\" (UniqueName: \"kubernetes.io/projected/201aace8-e4aa-43f6-8c57-4a59170e14b0-kube-api-access-dwlcc\") on node \"crc\" DevicePath \"\"" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.643867 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "201aace8-e4aa-43f6-8c57-4a59170e14b0" (UID: "201aace8-e4aa-43f6-8c57-4a59170e14b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.732707 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201aace8-e4aa-43f6-8c57-4a59170e14b0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.759123 4752 generic.go:334] "Generic (PLEG): container finished" podID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerID="4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408" exitCode=0 Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.759200 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerDied","Data":"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408"} Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.759242 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mswn2" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.759291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mswn2" event={"ID":"201aace8-e4aa-43f6-8c57-4a59170e14b0","Type":"ContainerDied","Data":"c14d25a6f4b42b967626cd6452c8f8241550a9959d081a5f933c48bed155f11a"} Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.759336 4752 scope.go:117] "RemoveContainer" containerID="4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.786723 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.790364 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mswn2"] Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.805450 4752 scope.go:117] "RemoveContainer" containerID="4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.833814 4752 scope.go:117] "RemoveContainer" containerID="51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.874312 4752 scope.go:117] "RemoveContainer" containerID="4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408" Nov 24 11:31:08 crc kubenswrapper[4752]: E1124 11:31:08.874843 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408\": container with ID starting with 4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408 not found: ID does not exist" containerID="4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.874881 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408"} err="failed to get container status \"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408\": rpc error: code = NotFound desc = could not find container \"4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408\": container with ID starting with 4d1f197794bb2479feb33c24bd7a74c686a4f7cf9bc336a0a95f14c89ecff408 not found: ID does not exist" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.874909 4752 scope.go:117] "RemoveContainer" containerID="4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795" Nov 24 11:31:08 crc kubenswrapper[4752]: E1124 11:31:08.875157 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795\": container with ID starting with 4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795 not found: ID does not exist" containerID="4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.875182 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795"} err="failed to get container status \"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795\": rpc error: code = NotFound desc = could not find 
container \"4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795\": container with ID starting with 4554fdbf58fb1d25f05fcb9614309bb23626af1db50a520b72ce2bf36c282795 not found: ID does not exist" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.875196 4752 scope.go:117] "RemoveContainer" containerID="51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e" Nov 24 11:31:08 crc kubenswrapper[4752]: E1124 11:31:08.875469 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e\": container with ID starting with 51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e not found: ID does not exist" containerID="51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e" Nov 24 11:31:08 crc kubenswrapper[4752]: I1124 11:31:08.875497 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e"} err="failed to get container status \"51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e\": rpc error: code = NotFound desc = could not find container \"51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e\": container with ID starting with 51a3e351f4bacf33a4cb504ebd1155ced2ad3767959cf5a7236a7cba5a48d15e not found: ID does not exist" Nov 24 11:31:10 crc kubenswrapper[4752]: I1124 11:31:10.742547 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" path="/var/lib/kubelet/pods/201aace8-e4aa-43f6-8c57-4a59170e14b0/volumes" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.468990 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.469397 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.469453 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.469967 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.470024 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" gracePeriod=600 Nov 24 11:31:15 crc kubenswrapper[4752]: E1124 11:31:15.602983 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.822943 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" exitCode=0 Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.823008 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03"} Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.823099 4752 scope.go:117] "RemoveContainer" containerID="1a145d3540c315b0dee20cc299bc5214b8f1897d2d61c319cc9bb2cf1542af39" Nov 24 11:31:15 crc kubenswrapper[4752]: I1124 11:31:15.824544 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:31:15 crc kubenswrapper[4752]: E1124 11:31:15.824909 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:31:28 crc kubenswrapper[4752]: I1124 11:31:28.728362 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:31:28 crc kubenswrapper[4752]: E1124 11:31:28.729199 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.200026 4752 scope.go:117] "RemoveContainer" containerID="e55b3974f19bd84417679acf10b5aaff2058381e530e84be3ead5c645b2ddec0" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.243303 4752 scope.go:117] "RemoveContainer" containerID="b756f1cb33ac9fd77f5462fb04d0efd55526c3b8882a7fb850fc306538938658" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.271410 4752 scope.go:117] "RemoveContainer" containerID="2081cff28fb25e3157d2d48fa39f73ccfb779669f82a6e60a027ff939e135e2c" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.291926 4752 scope.go:117] "RemoveContainer" containerID="fe306ec37ea8c3b9abfe5a4546df04b8bc9a09dc37fbf1c4dfdef5c811afc94e" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.329288 4752 scope.go:117] "RemoveContainer" containerID="7db16ce7ee3ac97fc182bddca6eb14221eeabd5088bcf45d2229d3661c794580" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.364353 4752 scope.go:117] "RemoveContainer" 
containerID="681a52540e99f093687c5c5cb86c74a9c9b1f74b74796ab532bbc3a3af899758" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.398884 4752 scope.go:117] "RemoveContainer" containerID="c2dc34b980c8970786595272b69db9751ab1245dd93ac741139b987c4b76363e" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.429283 4752 scope.go:117] "RemoveContainer" containerID="4df45b8527f2c003f39ff0dbda540833a7117cc872aceb703558a81c2b0f4ae1" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.452996 4752 scope.go:117] "RemoveContainer" containerID="aaadfc25131dfddd66fa261c60845cf30f3a819ab939eec3683de48fc967913c" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.481188 4752 scope.go:117] "RemoveContainer" containerID="608e26395b4dc1de0bd00669f57c463df6c52c83abe35e40f229dda6794c160f" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.503261 4752 scope.go:117] "RemoveContainer" containerID="28993df0524c42d77ebd5a549957c79d732fa86a3d6de2ac8c0b38ed64532ac2" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.537903 4752 scope.go:117] "RemoveContainer" containerID="11876dbf5b00d15eb3d002e8d11562f90c15b2bebf4c5f309e67dc8dc3e37bbb" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.571993 4752 scope.go:117] "RemoveContainer" containerID="b891d4bc5a7581172d59c7e614c2ede206cf3a7bc2dcfbffabe7f2e2bc23b602" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.594037 4752 scope.go:117] "RemoveContainer" containerID="31f4463200798c8e71465693c21c3b267736953dd9c6093042d651dea9d77b08" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.618827 4752 scope.go:117] "RemoveContainer" containerID="30b709fe7204f5ebbe3e77680225e7690d93a9b9881d529fce8b592b7c3e865e" Nov 24 11:31:39 crc kubenswrapper[4752]: I1124 11:31:39.640194 4752 scope.go:117] "RemoveContainer" containerID="8afbde4838c68ed65d0a791d3f6cf81c9dff82063ee60fdee90cec80f669ae26" Nov 24 11:31:41 crc kubenswrapper[4752]: I1124 11:31:41.727896 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:31:41 crc kubenswrapper[4752]: E1124 11:31:41.728296 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:31:53 crc kubenswrapper[4752]: I1124 11:31:53.728562 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:31:53 crc kubenswrapper[4752]: E1124 11:31:53.729453 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:32:05 crc kubenswrapper[4752]: I1124 11:32:05.728267 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:32:05 crc kubenswrapper[4752]: E1124 11:32:05.729227 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:32:20 crc kubenswrapper[4752]: I1124 11:32:20.728284 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:32:20 crc kubenswrapper[4752]: E1124 11:32:20.729233 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:32:34 crc kubenswrapper[4752]: I1124 11:32:34.733306 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:32:34 crc kubenswrapper[4752]: E1124 11:32:34.733880 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:32:39 crc kubenswrapper[4752]: I1124 11:32:39.866462 4752 scope.go:117] "RemoveContainer" containerID="9f8c293fe3f68cdc99eb5f8a593f9b4e799568c50b7fc38020752eb9bf6b2982" Nov 24 11:32:39 crc kubenswrapper[4752]: I1124 11:32:39.901400 4752 scope.go:117] "RemoveContainer" containerID="fad4f8068e453f4f0adeb3e91bcabcf8d768979c4e10c48b561d2681459e57aa" Nov 24 11:32:39 crc kubenswrapper[4752]: I1124 11:32:39.919540 4752 scope.go:117] "RemoveContainer" containerID="cc9d034a28c28ef8d0777d5e4c3d5981c8b605ee235018113534b49a6c30c837" Nov 24 11:32:39 crc kubenswrapper[4752]: I1124 11:32:39.942845 4752 scope.go:117] "RemoveContainer" containerID="020c79cb453e7768988ebf442209375f39cfa455420af82063716113800a4446" Nov 24 11:32:39 crc kubenswrapper[4752]: I1124 11:32:39.974255 4752 scope.go:117] "RemoveContainer" containerID="e0babdc37f4d23392fd7039a2c159f95eeb6c70c008d4d45704626d8cce25733" Nov 24 11:32:40 crc kubenswrapper[4752]: I1124 11:32:40.018530 4752 scope.go:117] "RemoveContainer" containerID="3ce9fb668d269804d6dd2f2c8f344a9ec3cad9852de93a6c221320e395600d05" Nov 24 11:32:40 crc kubenswrapper[4752]: I1124 11:32:40.034809 4752 scope.go:117] "RemoveContainer" containerID="5608c635aa46e4432dd124ea11759d4280b1e7c2301d6131246b4a3f068d7cee" Nov 24 11:32:40 crc kubenswrapper[4752]: I1124 11:32:40.057884 4752 scope.go:117] "RemoveContainer" containerID="b2e200c05ccf3e61807da30a56ceeccd81731dd7d3c5381db6ec4a545e58aceb" Nov 24 11:32:40 crc kubenswrapper[4752]: I1124 11:32:40.073480 4752 scope.go:117] "RemoveContainer" containerID="cd58caf1dc39366b3251ce0c0c7889a1bdc484f870e4acf5136461945278d7e8" Nov 24 11:32:48 crc kubenswrapper[4752]: I1124 11:32:48.728271 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:32:48 crc kubenswrapper[4752]: E1124 11:32:48.729167 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:32:59 crc kubenswrapper[4752]: I1124 11:32:59.729080 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:32:59 crc kubenswrapper[4752]: E1124 11:32:59.730991 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:33:10 crc kubenswrapper[4752]: I1124 11:33:10.728376 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:33:10 crc kubenswrapper[4752]: E1124 11:33:10.731333 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:33:24 crc kubenswrapper[4752]: I1124 11:33:24.732988 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:33:24 crc kubenswrapper[4752]: E1124 11:33:24.733702 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:33:39 crc kubenswrapper[4752]: I1124 11:33:39.728135 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:33:39 crc kubenswrapper[4752]: E1124 11:33:39.729039 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:33:40 crc kubenswrapper[4752]: I1124 11:33:40.201457 4752 scope.go:117] "RemoveContainer" containerID="b6a6637e2c4a956270f0ef941c35eb469cca815e47679929b858c1f2f545b628" Nov 24 11:33:40 crc kubenswrapper[4752]: I1124 11:33:40.246568 4752 scope.go:117] "RemoveContainer" containerID="456e104a9500a5d75fcca8f093e7825be12c29b5f33ca5320de653b354d9b91c" Nov 24 11:33:40 crc kubenswrapper[4752]: I1124 11:33:40.262235 4752 scope.go:117] 
"RemoveContainer" containerID="682dc8577a8c51a3a589c8dcc6befead348a66e7150c6f7f392a3e1290c56b3a" Nov 24 11:33:40 crc kubenswrapper[4752]: I1124 11:33:40.282160 4752 scope.go:117] "RemoveContainer" containerID="8874e4f51742808fc31228902a4ae1a8f430f19b1a2fd93d62169bec6c779b21" Nov 24 11:33:40 crc kubenswrapper[4752]: I1124 11:33:40.311007 4752 scope.go:117] "RemoveContainer" containerID="dfa9458977016460868074db72d499cd7233b5ead8ca22d285fc42aa9c5b96af" Nov 24 11:33:52 crc kubenswrapper[4752]: I1124 11:33:52.728879 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:33:52 crc kubenswrapper[4752]: E1124 11:33:52.729875 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:34:04 crc kubenswrapper[4752]: I1124 11:34:04.739877 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:34:04 crc kubenswrapper[4752]: E1124 11:34:04.741126 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:34:19 crc kubenswrapper[4752]: I1124 11:34:19.728970 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:34:19 crc kubenswrapper[4752]: E1124 11:34:19.733773 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:34:31 crc kubenswrapper[4752]: I1124 11:34:31.728933 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:34:31 crc kubenswrapper[4752]: E1124 11:34:31.730484 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:34:40 crc kubenswrapper[4752]: I1124 11:34:40.422936 4752 scope.go:117] "RemoveContainer" containerID="858e76b45449fa0fcdbccdf4d4f6325a65859816c0ee8d347ba69396fbd78a83" Nov 24 11:34:40 crc kubenswrapper[4752]: I1124 11:34:40.456816 4752 scope.go:117] "RemoveContainer" containerID="9568b7bef3cbeeeda117b907e1246694e8acf31ceba1e01df07fa02cb3727f40" Nov 24 11:34:40 crc kubenswrapper[4752]: I1124 
11:34:40.475337 4752 scope.go:117] "RemoveContainer" containerID="a4d46d258f63ded30ef4a97217ef0d958fa5fa116dacc0a15e72c21b6b729660" Nov 24 11:34:40 crc kubenswrapper[4752]: I1124 11:34:40.500369 4752 scope.go:117] "RemoveContainer" containerID="ab779a1c42b633961d1f5b419a52d5bcf3916d2d4fe9bc9ba7cc57e2fc21152c" Nov 24 11:34:45 crc kubenswrapper[4752]: I1124 11:34:45.727671 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:34:45 crc kubenswrapper[4752]: E1124 11:34:45.729218 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:34:59 crc kubenswrapper[4752]: I1124 11:34:59.727971 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:34:59 crc kubenswrapper[4752]: E1124 11:34:59.730125 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:35:14 crc kubenswrapper[4752]: I1124 11:35:14.732799 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:35:14 crc kubenswrapper[4752]: E1124 11:35:14.735296 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:35:25 crc kubenswrapper[4752]: I1124 11:35:25.728639 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:35:25 crc kubenswrapper[4752]: E1124 11:35:25.729307 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:35:37 crc kubenswrapper[4752]: I1124 11:35:37.730145 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:35:37 crc kubenswrapper[4752]: E1124 11:35:37.730939 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:35:48 crc kubenswrapper[4752]: I1124 11:35:48.728779 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:35:48 crc kubenswrapper[4752]: E1124 11:35:48.729658 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:36:00 crc kubenswrapper[4752]: I1124 11:36:00.728508 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:36:00 crc kubenswrapper[4752]: E1124 11:36:00.729300 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:36:11 crc kubenswrapper[4752]: I1124 11:36:11.728455 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:36:11 crc kubenswrapper[4752]: E1124 11:36:11.730393 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:36:23 crc kubenswrapper[4752]: I1124 11:36:23.728249 4752 scope.go:117] "RemoveContainer" containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:36:24 crc kubenswrapper[4752]: I1124 11:36:24.498121 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54"} Nov 24 11:38:45 crc kubenswrapper[4752]: I1124 11:38:45.469092 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:38:45 crc kubenswrapper[4752]: I1124 11:38:45.469720 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:39:15 crc kubenswrapper[4752]: I1124 
11:39:15.469154 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:39:15 crc kubenswrapper[4752]: I1124 11:39:15.469713 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:39:45 crc kubenswrapper[4752]: I1124 11:39:45.469054 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:39:45 crc kubenswrapper[4752]: I1124 11:39:45.469723 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:39:45 crc kubenswrapper[4752]: I1124 11:39:45.469826 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:39:45 crc kubenswrapper[4752]: I1124 11:39:45.470852 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:39:45 crc kubenswrapper[4752]: I1124 11:39:45.470951 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54" gracePeriod=600 Nov 24 11:39:46 crc kubenswrapper[4752]: I1124 11:39:46.196023 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54" exitCode=0 Nov 24 11:39:46 crc kubenswrapper[4752]: I1124 11:39:46.196104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54"} Nov 24 11:39:46 crc kubenswrapper[4752]: I1124 11:39:46.196473 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228"} Nov 24 11:39:46 crc kubenswrapper[4752]: I1124 11:39:46.196502 4752 scope.go:117] "RemoveContainer" 
containerID="a290b60e48084025bca90b1cd49b6254811f073d7356fc1da70f12f4b4085d03" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.459301 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:31 crc kubenswrapper[4752]: E1124 11:40:31.460516 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="extract-content" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.460541 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="extract-content" Nov 24 11:40:31 crc kubenswrapper[4752]: E1124 11:40:31.460577 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="extract-utilities" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.460588 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="extract-utilities" Nov 24 11:40:31 crc kubenswrapper[4752]: E1124 11:40:31.460609 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="registry-server" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.460623 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="registry-server" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.460895 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="201aace8-e4aa-43f6-8c57-4a59170e14b0" containerName="registry-server" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.462468 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.468996 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.646897 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6q9\" (UniqueName: \"kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.646996 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.647059 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.748229 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6q9\" (UniqueName: \"kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9\") pod \"redhat-operators-46whq\" (UID: 
\"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.748336 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.748380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.748864 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.748951 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.767875 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6q9\" (UniqueName: \"kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9\") pod \"redhat-operators-46whq\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:31 crc kubenswrapper[4752]: I1124 11:40:31.819472 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:32 crc kubenswrapper[4752]: I1124 11:40:32.234345 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:32 crc kubenswrapper[4752]: I1124 11:40:32.600348 4752 generic.go:334] "Generic (PLEG): container finished" podID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerID="4b3aa539bb2a0210afcb6fe8ba4dfd2c7350834fe5d5809c269fa4aaf1ba6d06" exitCode=0 Nov 24 11:40:32 crc kubenswrapper[4752]: I1124 11:40:32.600397 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerDied","Data":"4b3aa539bb2a0210afcb6fe8ba4dfd2c7350834fe5d5809c269fa4aaf1ba6d06"} Nov 24 11:40:32 crc kubenswrapper[4752]: I1124 11:40:32.600447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerStarted","Data":"595bd2f0ad6d8cac4c664d324f101ccf9fe2eb6cf765ac47ea0db14659f71448"} Nov 24 11:40:32 crc kubenswrapper[4752]: I1124 11:40:32.602559 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 11:40:33 crc kubenswrapper[4752]: I1124 11:40:33.613267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerStarted","Data":"f5f490f13bdbe3e98891cf126ed237aa318d3d193d1d6575ed32e9b00af8900c"} Nov 24 11:40:34 crc kubenswrapper[4752]: I1124 11:40:34.627188 4752 generic.go:334] "Generic (PLEG): container finished" podID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerID="f5f490f13bdbe3e98891cf126ed237aa318d3d193d1d6575ed32e9b00af8900c" exitCode=0 Nov 24 11:40:34 crc kubenswrapper[4752]: I1124 11:40:34.627226 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerDied","Data":"f5f490f13bdbe3e98891cf126ed237aa318d3d193d1d6575ed32e9b00af8900c"} Nov 24 11:40:35 crc kubenswrapper[4752]: I1124 11:40:35.638578 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerStarted","Data":"ea4471d2c0c1a118542b3c2131d4da537b9e6d07a7f49e7dad060927706bbf89"} Nov 24 11:40:35 crc kubenswrapper[4752]: I1124 11:40:35.667726 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-46whq" podStartSLOduration=2.221610035 podStartE2EDuration="4.667705663s" podCreationTimestamp="2025-11-24 11:40:31 +0000 UTC" firstStartedPulling="2025-11-24 11:40:32.602324272 +0000 UTC m=+2038.587144561" lastFinishedPulling="2025-11-24 11:40:35.04841985 +0000 UTC m=+2041.033240189" observedRunningTime="2025-11-24 11:40:35.659506085 +0000 UTC m=+2041.644326374" watchObservedRunningTime="2025-11-24 11:40:35.667705663 +0000 UTC m=+2041.652525952" Nov 24 11:40:41 crc kubenswrapper[4752]: I1124 11:40:41.819934 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:41 crc kubenswrapper[4752]: I1124 11:40:41.820521 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:41 crc 
kubenswrapper[4752]: I1124 11:40:41.898071 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:42 crc kubenswrapper[4752]: I1124 11:40:42.768634 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:42 crc kubenswrapper[4752]: I1124 11:40:42.817097 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:44 crc kubenswrapper[4752]: I1124 11:40:44.739870 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-46whq" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="registry-server" containerID="cri-o://ea4471d2c0c1a118542b3c2131d4da537b9e6d07a7f49e7dad060927706bbf89" gracePeriod=2 Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.565799 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.567603 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.585346 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.585623 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8q9\" (UniqueName: \"kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.585695 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-catalog-content\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.586780 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.686790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.686892 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8q9\" (UniqueName: \"kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.686922 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-catalog-content\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.687431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-catalog-content\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.687466 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.708174 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8q9\" (UniqueName: \"kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9\") pod \"community-operators-nl7tl\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.748843 4752 generic.go:334] "Generic (PLEG): container finished" podID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerID="ea4471d2c0c1a118542b3c2131d4da537b9e6d07a7f49e7dad060927706bbf89" exitCode=0 Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.748876 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerDied","Data":"ea4471d2c0c1a118542b3c2131d4da537b9e6d07a7f49e7dad060927706bbf89"} Nov 24 11:40:45 crc kubenswrapper[4752]: I1124 11:40:45.900104 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.280288 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.395214 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities\") pod \"cfa01e41-7f49-455f-ae77-821edc4a0018\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.395258 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content\") pod \"cfa01e41-7f49-455f-ae77-821edc4a0018\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.395313 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6q9\" (UniqueName: \"kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9\") pod \"cfa01e41-7f49-455f-ae77-821edc4a0018\" (UID: \"cfa01e41-7f49-455f-ae77-821edc4a0018\") " Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.396549 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities" (OuterVolumeSpecName: "utilities") pod "cfa01e41-7f49-455f-ae77-821edc4a0018" (UID: "cfa01e41-7f49-455f-ae77-821edc4a0018"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.401737 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9" (OuterVolumeSpecName: "kube-api-access-cg6q9") pod "cfa01e41-7f49-455f-ae77-821edc4a0018" (UID: "cfa01e41-7f49-455f-ae77-821edc4a0018"). InnerVolumeSpecName "kube-api-access-cg6q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:40:46 crc kubenswrapper[4752]: W1124 11:40:46.487924 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a05b3c_7f50_43bb_8fad_9de225d4fb96.slice/crio-1ab5d6cd31d9be362a6871309e1aa7d4e0b30ca91918efd9db71f4bef63d4935 WatchSource:0}: Error finding container 1ab5d6cd31d9be362a6871309e1aa7d4e0b30ca91918efd9db71f4bef63d4935: Status 404 returned error can't find the container with id 1ab5d6cd31d9be362a6871309e1aa7d4e0b30ca91918efd9db71f4bef63d4935 Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.489326 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.496314 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.496339 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6q9\" (UniqueName: \"kubernetes.io/projected/cfa01e41-7f49-455f-ae77-821edc4a0018-kube-api-access-cg6q9\") on node \"crc\" DevicePath \"\"" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.520136 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfa01e41-7f49-455f-ae77-821edc4a0018" (UID: "cfa01e41-7f49-455f-ae77-821edc4a0018"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.597456 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa01e41-7f49-455f-ae77-821edc4a0018-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.759729 4752 generic.go:334] "Generic (PLEG): container finished" podID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerID="b31a6265aa77f22b3b745ec00b5ce2fe238242508f5a5b3ce5ccd62450150b0c" exitCode=0 Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.759809 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerDied","Data":"b31a6265aa77f22b3b745ec00b5ce2fe238242508f5a5b3ce5ccd62450150b0c"} Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.759886 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerStarted","Data":"1ab5d6cd31d9be362a6871309e1aa7d4e0b30ca91918efd9db71f4bef63d4935"} Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.765013 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-46whq" event={"ID":"cfa01e41-7f49-455f-ae77-821edc4a0018","Type":"ContainerDied","Data":"595bd2f0ad6d8cac4c664d324f101ccf9fe2eb6cf765ac47ea0db14659f71448"} Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.765071 4752 scope.go:117] "RemoveContainer" containerID="ea4471d2c0c1a118542b3c2131d4da537b9e6d07a7f49e7dad060927706bbf89" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.765194 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-46whq" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.799768 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.801822 4752 scope.go:117] "RemoveContainer" containerID="f5f490f13bdbe3e98891cf126ed237aa318d3d193d1d6575ed32e9b00af8900c" Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.804214 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-46whq"] Nov 24 11:40:46 crc kubenswrapper[4752]: I1124 11:40:46.825335 4752 scope.go:117] "RemoveContainer" containerID="4b3aa539bb2a0210afcb6fe8ba4dfd2c7350834fe5d5809c269fa4aaf1ba6d06" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.751477 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" path="/var/lib/kubelet/pods/cfa01e41-7f49-455f-ae77-821edc4a0018/volumes" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.752759 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:40:48 crc kubenswrapper[4752]: E1124 11:40:48.753017 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="extract-utilities" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.753036 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="extract-utilities" Nov 24 11:40:48 crc kubenswrapper[4752]: E1124 11:40:48.753053 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="registry-server" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.753059 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="registry-server" Nov 24 11:40:48 crc kubenswrapper[4752]: E1124 11:40:48.753083 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="extract-content" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.753089 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="extract-content" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.753264 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa01e41-7f49-455f-ae77-821edc4a0018" containerName="registry-server" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.754691 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.755127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.934400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzpl\" (UniqueName: \"kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.934459 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:48 crc kubenswrapper[4752]: I1124 11:40:48.934507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.036217 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzpl\" (UniqueName: \"kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.036275 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.036303 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.036822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.036877 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.059108 4752 operation_generator.go:637] "MountVolume.SetUp 
Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.059108 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzpl\" (UniqueName: \"kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl\") pod \"redhat-marketplace-hl28q\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:49 crc kubenswrapper[4752]: I1124 11:40:49.080435 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:50 crc kubenswrapper[4752]: I1124 11:40:50.398607 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:40:50 crc kubenswrapper[4752]: W1124 11:40:50.425033 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b31064_0358_4c82_bd3b_24359fcefd72.slice/crio-0f09cc428f18c0cfb0733aea4db05ea2889d76fc6740bd46bab8c45a6ae7fb52 WatchSource:0}: Error finding container 0f09cc428f18c0cfb0733aea4db05ea2889d76fc6740bd46bab8c45a6ae7fb52: Status 404 returned error can't find the container with id 0f09cc428f18c0cfb0733aea4db05ea2889d76fc6740bd46bab8c45a6ae7fb52 Nov 24 11:40:50 crc kubenswrapper[4752]: I1124 11:40:50.802420 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerStarted","Data":"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b"} Nov 24 11:40:50 crc kubenswrapper[4752]: I1124 11:40:50.802489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerStarted","Data":"0f09cc428f18c0cfb0733aea4db05ea2889d76fc6740bd46bab8c45a6ae7fb52"} Nov 24 11:40:50 crc kubenswrapper[4752]: I1124 11:40:50.804995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerStarted","Data":"188fcfd77293d3326fecd6afae3a4273510e77c860a2decc9c73121acacc6b09"} Nov 24 11:40:51 crc kubenswrapper[4752]: I1124 11:40:51.816679 4752 generic.go:334] "Generic (PLEG): container finished" podID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerID="188fcfd77293d3326fecd6afae3a4273510e77c860a2decc9c73121acacc6b09" exitCode=0 Nov 24 11:40:51 crc kubenswrapper[4752]: I1124 11:40:51.816761 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerDied","Data":"188fcfd77293d3326fecd6afae3a4273510e77c860a2decc9c73121acacc6b09"} Nov 24 11:40:51 crc kubenswrapper[4752]: I1124 11:40:51.823836 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerID="835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b" exitCode=0 Nov 24 11:40:51 crc kubenswrapper[4752]: I1124 11:40:51.824321 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerDied","Data":"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b"} Nov 24 11:40:52 crc kubenswrapper[4752]: I1124 11:40:52.833592 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl"
event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerStarted","Data":"6a5c2efbe9352ecf09dfc9cc19f1956bda6aa2fcfde12d97064f2e7c36b965e8"} Nov 24 11:40:52 crc kubenswrapper[4752]: I1124 11:40:52.836087 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerStarted","Data":"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c"} Nov 24 11:40:52 crc kubenswrapper[4752]: I1124 11:40:52.863632 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nl7tl" podStartSLOduration=2.118236894 podStartE2EDuration="7.863613458s" podCreationTimestamp="2025-11-24 11:40:45 +0000 UTC" firstStartedPulling="2025-11-24 11:40:46.761210289 +0000 UTC m=+2052.746030578" lastFinishedPulling="2025-11-24 11:40:52.506586843 +0000 UTC m=+2058.491407142" observedRunningTime="2025-11-24 11:40:52.857430349 +0000 UTC m=+2058.842250648" watchObservedRunningTime="2025-11-24 11:40:52.863613458 +0000 UTC m=+2058.848433757" Nov 24 11:40:53 crc kubenswrapper[4752]: I1124 11:40:53.845290 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerID="87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c" exitCode=0 Nov 24 11:40:53 crc kubenswrapper[4752]: I1124 11:40:53.845382 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerDied","Data":"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c"} Nov 24 11:40:54 crc kubenswrapper[4752]: I1124 11:40:54.856784 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerStarted","Data":"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f"} Nov 24 11:40:54 crc kubenswrapper[4752]: I1124 11:40:54.880378 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hl28q" podStartSLOduration=4.465554223 podStartE2EDuration="6.880356454s" podCreationTimestamp="2025-11-24 11:40:48 +0000 UTC" firstStartedPulling="2025-11-24 11:40:51.825595211 +0000 UTC m=+2057.810415500" lastFinishedPulling="2025-11-24 11:40:54.240397432 +0000 UTC m=+2060.225217731" observedRunningTime="2025-11-24 11:40:54.874082542 +0000 UTC m=+2060.858902841" watchObservedRunningTime="2025-11-24 11:40:54.880356454 +0000 UTC m=+2060.865176763" Nov 24 11:40:55 crc kubenswrapper[4752]: I1124 11:40:55.901027 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:55 crc kubenswrapper[4752]: I1124 11:40:55.901365 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:55 crc kubenswrapper[4752]: I1124 11:40:55.946370 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:40:59 crc kubenswrapper[4752]: I1124 11:40:59.081605 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:59 crc kubenswrapper[4752]: I1124 11:40:59.082041 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:59 crc kubenswrapper[4752]: I1124 11:40:59.124319 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:40:59 crc kubenswrapper[4752]: I1124 11:40:59.932032 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:41:00 crc kubenswrapper[4752]: I1124 11:41:00.334044 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:41:01 crc kubenswrapper[4752]: I1124 11:41:01.911426 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hl28q" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="registry-server" containerID="cri-o://0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f" gracePeriod=2 Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.256636 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.338807 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content\") pod \"a8b31064-0358-4c82-bd3b-24359fcefd72\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.338880 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzzpl\" (UniqueName: \"kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl\") pod \"a8b31064-0358-4c82-bd3b-24359fcefd72\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.338912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities\") pod \"a8b31064-0358-4c82-bd3b-24359fcefd72\" (UID: \"a8b31064-0358-4c82-bd3b-24359fcefd72\") " Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.340533 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities" (OuterVolumeSpecName: "utilities") pod "a8b31064-0358-4c82-bd3b-24359fcefd72" (UID: "a8b31064-0358-4c82-bd3b-24359fcefd72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.350011 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl" (OuterVolumeSpecName: "kube-api-access-nzzpl") pod "a8b31064-0358-4c82-bd3b-24359fcefd72" (UID: "a8b31064-0358-4c82-bd3b-24359fcefd72"). InnerVolumeSpecName "kube-api-access-nzzpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.357639 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8b31064-0358-4c82-bd3b-24359fcefd72" (UID: "a8b31064-0358-4c82-bd3b-24359fcefd72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.440138 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.440173 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzzpl\" (UniqueName: \"kubernetes.io/projected/a8b31064-0358-4c82-bd3b-24359fcefd72-kube-api-access-nzzpl\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.440186 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8b31064-0358-4c82-bd3b-24359fcefd72-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.923813 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerID="0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f" exitCode=0 Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.923872 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerDied","Data":"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f"} Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.923928 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl28q" event={"ID":"a8b31064-0358-4c82-bd3b-24359fcefd72","Type":"ContainerDied","Data":"0f09cc428f18c0cfb0733aea4db05ea2889d76fc6740bd46bab8c45a6ae7fb52"} Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.923951 4752 scope.go:117] "RemoveContainer" containerID="0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.923968 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl28q" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.946728 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.951696 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl28q"] Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.957295 4752 scope.go:117] "RemoveContainer" containerID="87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c" Nov 24 11:41:02 crc kubenswrapper[4752]: I1124 11:41:02.981954 4752 scope.go:117] "RemoveContainer" containerID="835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.026661 4752 scope.go:117] "RemoveContainer" containerID="0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f" Nov 24 11:41:03 crc kubenswrapper[4752]: E1124 11:41:03.027812 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f\": container with ID starting with 0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f not found: ID does not exist" containerID="0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.027855 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f"} err="failed to get container status \"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f\": rpc error: code = NotFound desc = could not find container \"0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f\": container with ID starting with 0e63d3a2279da8004000ea88b94fb5c6ff0d4f64e24f13df412c0c3936dac76f not found: ID does not exist" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.027881 4752 scope.go:117] "RemoveContainer" containerID="87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c" Nov 24 11:41:03 crc kubenswrapper[4752]: E1124 11:41:03.028353 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c\": container with ID starting with 87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c not found: ID does not exist" containerID="87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.028372 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c"} err="failed to get container status \"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c\": rpc error: code = NotFound desc = could not find container \"87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c\": container with ID starting with 87dd66213810b4ca57dbeed2ee639c324bde5f59fad5fad2cece3bdf30c4293c not found: ID does not exist" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.028385 4752 scope.go:117] "RemoveContainer" containerID="835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b" Nov 24 11:41:03 crc kubenswrapper[4752]: E1124 11:41:03.028718 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b\": container with ID starting with 835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b not found: ID does not exist" containerID="835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b" Nov 24 11:41:03 crc kubenswrapper[4752]: I1124 11:41:03.028736 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b"} err="failed to get container status \"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b\": rpc error: code = NotFound desc = could not find container \"835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b\": container with ID starting with 835076d663102e67fa79fd88343c6891d1dc52dbc9506d0fef4be4571e834c3b not found: ID does not exist" Nov 24 11:41:04 crc kubenswrapper[4752]: I1124 11:41:04.741115 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" path="/var/lib/kubelet/pods/a8b31064-0358-4c82-bd3b-24359fcefd72/volumes" Nov 24 11:41:05 crc kubenswrapper[4752]: I1124 11:41:05.947820 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.027659 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.075394 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.075626 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pcgzp" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="registry-server" containerID="cri-o://41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395" gracePeriod=2 Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.507918 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.598371 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25ggt\" (UniqueName: \"kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt\") pod \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.598438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content\") pod \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.598483 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities\") pod \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\" (UID: \"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2\") " Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.599234 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities" (OuterVolumeSpecName: "utilities") pod "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" (UID: "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.604009 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt" (OuterVolumeSpecName: "kube-api-access-25ggt") pod "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" (UID: "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2"). InnerVolumeSpecName "kube-api-access-25ggt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.650090 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" (UID: "12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.700469 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25ggt\" (UniqueName: \"kubernetes.io/projected/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-kube-api-access-25ggt\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.700508 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.700522 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.976180 4752 generic.go:334] "Generic (PLEG): container finished" podID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerID="41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395" exitCode=0 Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.976230 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerDied","Data":"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395"} Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.976259 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pcgzp" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.976283 4752 scope.go:117] "RemoveContainer" containerID="41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395" Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.976270 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pcgzp" event={"ID":"12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2","Type":"ContainerDied","Data":"5ad0b8921f3bbfddbeaaf2152ed651a3ed1f4ffbd299afbfa82bb2aa037b00e3"} Nov 24 11:41:06 crc kubenswrapper[4752]: I1124 11:41:06.996774 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.001035 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pcgzp"] Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.001880 4752 scope.go:117] "RemoveContainer" containerID="963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.022516 4752 scope.go:117] "RemoveContainer" containerID="8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.045642 4752 scope.go:117] "RemoveContainer" containerID="41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395" Nov 24 11:41:07 crc kubenswrapper[4752]: E1124 11:41:07.046091 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395\": container with ID starting with 41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395 not found: ID does not exist" containerID="41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.046126 
4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395"} err="failed to get container status \"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395\": rpc error: code = NotFound desc = could not find container \"41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395\": container with ID starting with 41fcaca80bce6fa6bd8bfd6d3b41bbad25a85e75ea620497889900ae27752395 not found: ID does not exist" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.046153 4752 scope.go:117] "RemoveContainer" containerID="963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02" Nov 24 11:41:07 crc kubenswrapper[4752]: E1124 11:41:07.046424 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02\": container with ID starting with 963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02 not found: ID does not exist" containerID="963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.046464 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02"} err="failed to get container status \"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02\": rpc error: code = NotFound desc = could not find container \"963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02\": container with ID starting with 963ce0dd6ba8c7196f28953f0e318fac3168e9d4c18cfae00979c4c0045a8f02 not found: ID does not exist" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.046489 4752 scope.go:117] "RemoveContainer" containerID="8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22" Nov 24 11:41:07 crc kubenswrapper[4752]: E1124 11:41:07.046737 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22\": container with ID starting with 8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22 not found: ID does not exist" containerID="8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22" Nov 24 11:41:07 crc kubenswrapper[4752]: I1124 11:41:07.046773 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22"} err="failed to get container status \"8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22\": rpc error: code = NotFound desc = could not find container \"8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22\": container with ID starting with 8aba01788d25286e7f7941edc1e4a054220d83be847aca964c692a97c1293e22 not found: ID does not exist" Nov 24 11:41:08 crc kubenswrapper[4752]: I1124 11:41:08.737622 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" path="/var/lib/kubelet/pods/12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2/volumes" Nov 24 11:41:45 crc kubenswrapper[4752]: I1124 11:41:45.469035 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:41:45 crc kubenswrapper[4752]: I1124 11:41:45.469638 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.715623 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716579 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="extract-content" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716594 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="extract-content" Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716614 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716622 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716645 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716655 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716672 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="extract-utilities" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716681 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="extract-utilities" Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716690 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="extract-utilities" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716698 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="extract-utilities" Nov 24 11:42:06 crc kubenswrapper[4752]: E1124 11:42:06.716718 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="extract-content" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716725 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="extract-content" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716903 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b31064-0358-4c82-bd3b-24359fcefd72" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.716928 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a8f2f1-a3ce-4db6-bed0-da57ac3afdd2" containerName="registry-server" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.718590 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.752999 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.816106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.816459 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp4h2\" (UniqueName: \"kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.816887 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.918272 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.918330 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp4h2\" (UniqueName: \"kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.918455 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.919112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.919259 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:06 crc kubenswrapper[4752]: I1124 11:42:06.942482 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mp4h2\" (UniqueName: \"kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2\") pod \"certified-operators-2szc4\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:07 crc kubenswrapper[4752]: I1124 11:42:07.044127 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:07 crc kubenswrapper[4752]: I1124 11:42:07.334892 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:07 crc kubenswrapper[4752]: I1124 11:42:07.653622 4752 generic.go:334] "Generic (PLEG): container finished" podID="94c9f396-0306-4484-bf67-cd7243c8066f" containerID="61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c" exitCode=0 Nov 24 11:42:07 crc kubenswrapper[4752]: I1124 11:42:07.653663 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerDied","Data":"61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c"} Nov 24 11:42:07 crc kubenswrapper[4752]: I1124 11:42:07.653714 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerStarted","Data":"64172d0c6cdd538bf6a7e0a9dfa7fe9beb3907a6d1bb657d8d1a048176e6efdc"} Nov 24 11:42:08 crc kubenswrapper[4752]: I1124 11:42:08.666983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerStarted","Data":"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25"} Nov 24 11:42:09 crc kubenswrapper[4752]: I1124 11:42:09.678261 4752 generic.go:334] "Generic (PLEG): container finished" podID="94c9f396-0306-4484-bf67-cd7243c8066f" containerID="4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25" exitCode=0 Nov 24 11:42:09 crc kubenswrapper[4752]: I1124 11:42:09.678338 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerDied","Data":"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25"} Nov 24 11:42:10 crc kubenswrapper[4752]: I1124 11:42:10.694346 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerStarted","Data":"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36"} Nov 24 11:42:10 crc kubenswrapper[4752]: I1124 11:42:10.717058 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2szc4" podStartSLOduration=2.040600183 podStartE2EDuration="4.717015712s" podCreationTimestamp="2025-11-24 11:42:06 +0000 UTC" firstStartedPulling="2025-11-24 11:42:07.655136863 +0000 UTC m=+2133.639957152" lastFinishedPulling="2025-11-24 11:42:10.331552402 +0000 UTC m=+2136.316372681" observedRunningTime="2025-11-24 11:42:10.715072416 +0000 UTC m=+2136.699892725" watchObservedRunningTime="2025-11-24 11:42:10.717015712 +0000 UTC m=+2136.701836011" Nov 24 11:42:15 crc kubenswrapper[4752]: I1124 11:42:15.469614 4752 patch_prober.go:28] interesting 
pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:42:15 crc kubenswrapper[4752]: I1124 11:42:15.470252 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:42:17 crc kubenswrapper[4752]: I1124 11:42:17.044947 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:17 crc kubenswrapper[4752]: I1124 11:42:17.045325 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:17 crc kubenswrapper[4752]: I1124 11:42:17.124294 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:17 crc kubenswrapper[4752]: I1124 11:42:17.803731 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.295179 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.296579 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2szc4" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="registry-server" containerID="cri-o://e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36" gracePeriod=2 Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.766386 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.793628 4752 generic.go:334] "Generic (PLEG): container finished" podID="94c9f396-0306-4484-bf67-cd7243c8066f" containerID="e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36" exitCode=0 Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.793676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerDied","Data":"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36"} Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.793710 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2szc4" event={"ID":"94c9f396-0306-4484-bf67-cd7243c8066f","Type":"ContainerDied","Data":"64172d0c6cdd538bf6a7e0a9dfa7fe9beb3907a6d1bb657d8d1a048176e6efdc"} Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.793732 4752 scope.go:117] "RemoveContainer" containerID="e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.793897 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2szc4" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.814212 4752 scope.go:117] "RemoveContainer" containerID="4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.834992 4752 scope.go:117] "RemoveContainer" containerID="61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.836065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp4h2\" (UniqueName: \"kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2\") pod \"94c9f396-0306-4484-bf67-cd7243c8066f\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.836169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities\") pod \"94c9f396-0306-4484-bf67-cd7243c8066f\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.836202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content\") pod \"94c9f396-0306-4484-bf67-cd7243c8066f\" (UID: \"94c9f396-0306-4484-bf67-cd7243c8066f\") " Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.837284 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities" (OuterVolumeSpecName: "utilities") pod "94c9f396-0306-4484-bf67-cd7243c8066f" (UID: "94c9f396-0306-4484-bf67-cd7243c8066f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.841504 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2" (OuterVolumeSpecName: "kube-api-access-mp4h2") pod "94c9f396-0306-4484-bf67-cd7243c8066f" (UID: "94c9f396-0306-4484-bf67-cd7243c8066f"). InnerVolumeSpecName "kube-api-access-mp4h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.881416 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94c9f396-0306-4484-bf67-cd7243c8066f" (UID: "94c9f396-0306-4484-bf67-cd7243c8066f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.896648 4752 scope.go:117] "RemoveContainer" containerID="e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36" Nov 24 11:42:20 crc kubenswrapper[4752]: E1124 11:42:20.897214 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36\": container with ID starting with e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36 not found: ID does not exist" containerID="e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.897258 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36"} err="failed to get container status \"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36\": rpc error: code = NotFound desc = could not find container \"e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36\": container with ID starting with e48cf98f45bad39cbed30c77e7369b317e9968fae30abdc41fefb4026dfc5a36 not found: ID does not exist" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.897285 4752 scope.go:117] "RemoveContainer" containerID="4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25" Nov 24 11:42:20 crc kubenswrapper[4752]: E1124 11:42:20.897577 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25\": container with ID starting with 4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25 not found: ID does not exist" containerID="4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.897607 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25"} err="failed to get container status \"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25\": rpc error: code = NotFound desc = could not find container \"4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25\": container with ID starting with 4e4f5d3669e2d76159c8d0663a2cd8707932fe8b50d93aabb4ab1a594cea2e25 not found: ID does not exist" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.897622 4752 scope.go:117] "RemoveContainer" containerID="61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c" Nov 24 11:42:20 crc kubenswrapper[4752]: E1124 11:42:20.897897 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c\": container with ID starting with 61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c not found: ID does not exist" containerID="61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.897920 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c"} err="failed to get container status \"61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c\": rpc error: code = NotFound desc = could not 
find container \"61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c\": container with ID starting with 61cc1991396062b6e9448c01d6f64a068a4674530c057639d8a1827c3a30e58c not found: ID does not exist" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.938175 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.938215 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94c9f396-0306-4484-bf67-cd7243c8066f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:42:20 crc kubenswrapper[4752]: I1124 11:42:20.938230 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp4h2\" (UniqueName: \"kubernetes.io/projected/94c9f396-0306-4484-bf67-cd7243c8066f-kube-api-access-mp4h2\") on node \"crc\" DevicePath \"\"" Nov 24 11:42:21 crc kubenswrapper[4752]: I1124 11:42:21.126732 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:21 crc kubenswrapper[4752]: I1124 11:42:21.131652 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2szc4"] Nov 24 11:42:22 crc kubenswrapper[4752]: I1124 11:42:22.743082 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" path="/var/lib/kubelet/pods/94c9f396-0306-4484-bf67-cd7243c8066f/volumes" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.469537 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.470292 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.470558 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.471510 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.471598 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" gracePeriod=600 Nov 24 11:42:45 crc kubenswrapper[4752]: E1124 11:42:45.591605 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.990822 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" exitCode=0 Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.990869 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228"} Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.990933 4752 scope.go:117] "RemoveContainer" containerID="b317061cfbb267e4bf956b4697e267978fbf62eb5297c5426fa3230aef485d54" Nov 24 11:42:45 crc kubenswrapper[4752]: I1124 11:42:45.991430 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:42:45 crc kubenswrapper[4752]: E1124 11:42:45.991859 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:42:57 crc kubenswrapper[4752]: I1124 11:42:57.728885 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:42:57 crc kubenswrapper[4752]: E1124 11:42:57.730254 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:43:08 crc kubenswrapper[4752]: I1124 11:43:08.728646 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:43:08 crc kubenswrapper[4752]: E1124 11:43:08.729960 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:43:22 crc kubenswrapper[4752]: I1124 11:43:22.728125 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:43:22 crc kubenswrapper[4752]: E1124 11:43:22.728929 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:43:37 crc kubenswrapper[4752]: I1124 11:43:37.728840 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:43:37 crc kubenswrapper[4752]: E1124 11:43:37.730876 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:43:50 crc kubenswrapper[4752]: I1124 11:43:50.727968 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:43:50 crc kubenswrapper[4752]: E1124 11:43:50.728659 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:44:01 crc kubenswrapper[4752]: I1124 11:44:01.728519 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:44:01 crc kubenswrapper[4752]: E1124 11:44:01.729694 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:44:13 crc kubenswrapper[4752]: I1124 11:44:13.728601 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:44:13 crc kubenswrapper[4752]: E1124 11:44:13.729336 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:44:25 crc kubenswrapper[4752]: I1124 11:44:25.728120 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:44:25 crc kubenswrapper[4752]: E1124 11:44:25.728971 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:44:39 crc kubenswrapper[4752]: I1124 11:44:39.728391 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:44:39 crc kubenswrapper[4752]: E1124 11:44:39.729396 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:44:50 crc kubenswrapper[4752]: I1124 11:44:50.727533 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:44:50 crc kubenswrapper[4752]: E1124 11:44:50.728277 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.165787 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk"] Nov 24 11:45:00 crc kubenswrapper[4752]: E1124 11:45:00.172033 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="extract-utilities" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.172070 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="extract-utilities" Nov 24 11:45:00 crc kubenswrapper[4752]: E1124 11:45:00.172107 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="registry-server" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.172120 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="registry-server" Nov 24 11:45:00 crc kubenswrapper[4752]: E1124 11:45:00.172134 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="extract-content" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.172145 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="extract-content" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.172405 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c9f396-0306-4484-bf67-cd7243c8066f" containerName="registry-server" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.173115 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.178232 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.178238 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.183467 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk"] Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.265608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmr2k\" (UniqueName: \"kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.266302 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.266474 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.368033 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmr2k\" (UniqueName: \"kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.368126 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.368230 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.369245 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume\") pod 
\"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.378402 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.391918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmr2k\" (UniqueName: \"kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k\") pod \"collect-profiles-29399745-gsknk\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.493676 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:00 crc kubenswrapper[4752]: I1124 11:45:00.947069 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk"] Nov 24 11:45:01 crc kubenswrapper[4752]: I1124 11:45:01.170584 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" event={"ID":"6b232e27-0a9b-4d35-9116-60a26c2deb80","Type":"ContainerStarted","Data":"effa82a305dce892bcb51a1289dac1f34fa53250a8b21c97bc9398e29a999628"} Nov 24 11:45:01 crc kubenswrapper[4752]: I1124 11:45:01.171136 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" event={"ID":"6b232e27-0a9b-4d35-9116-60a26c2deb80","Type":"ContainerStarted","Data":"540034fde1ce228047a71761ada86cebd7e44453fb82ff6010c9d494fe514d28"} Nov 24 11:45:01 crc kubenswrapper[4752]: I1124 11:45:01.189902 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" podStartSLOduration=1.189875905 podStartE2EDuration="1.189875905s" podCreationTimestamp="2025-11-24 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 11:45:01.188635799 +0000 UTC m=+2307.173456098" watchObservedRunningTime="2025-11-24 11:45:01.189875905 +0000 UTC m=+2307.174696194" Nov 24 11:45:02 crc kubenswrapper[4752]: I1124 11:45:02.183183 4752 generic.go:334] "Generic (PLEG): container finished" podID="6b232e27-0a9b-4d35-9116-60a26c2deb80" containerID="effa82a305dce892bcb51a1289dac1f34fa53250a8b21c97bc9398e29a999628" exitCode=0 Nov 24 11:45:02 crc kubenswrapper[4752]: I1124 11:45:02.183261 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" event={"ID":"6b232e27-0a9b-4d35-9116-60a26c2deb80","Type":"ContainerDied","Data":"effa82a305dce892bcb51a1289dac1f34fa53250a8b21c97bc9398e29a999628"} Nov 24 11:45:02 crc kubenswrapper[4752]: I1124 11:45:02.728660 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:45:02 crc kubenswrapper[4752]: E1124 11:45:02.729390 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.464097 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.617074 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume\") pod \"6b232e27-0a9b-4d35-9116-60a26c2deb80\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.617267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume\") pod \"6b232e27-0a9b-4d35-9116-60a26c2deb80\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.617431 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmr2k\" (UniqueName: \"kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k\") pod \"6b232e27-0a9b-4d35-9116-60a26c2deb80\" (UID: \"6b232e27-0a9b-4d35-9116-60a26c2deb80\") " Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.617668 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b232e27-0a9b-4d35-9116-60a26c2deb80" (UID: "6b232e27-0a9b-4d35-9116-60a26c2deb80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.617799 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b232e27-0a9b-4d35-9116-60a26c2deb80-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.623100 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b232e27-0a9b-4d35-9116-60a26c2deb80" (UID: "6b232e27-0a9b-4d35-9116-60a26c2deb80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.627050 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k" (OuterVolumeSpecName: "kube-api-access-mmr2k") pod "6b232e27-0a9b-4d35-9116-60a26c2deb80" (UID: "6b232e27-0a9b-4d35-9116-60a26c2deb80"). InnerVolumeSpecName "kube-api-access-mmr2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.719204 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b232e27-0a9b-4d35-9116-60a26c2deb80-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 11:45:03 crc kubenswrapper[4752]: I1124 11:45:03.719785 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmr2k\" (UniqueName: \"kubernetes.io/projected/6b232e27-0a9b-4d35-9116-60a26c2deb80-kube-api-access-mmr2k\") on node \"crc\" DevicePath \"\"" Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.201852 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" event={"ID":"6b232e27-0a9b-4d35-9116-60a26c2deb80","Type":"ContainerDied","Data":"540034fde1ce228047a71761ada86cebd7e44453fb82ff6010c9d494fe514d28"} Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.202211 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="540034fde1ce228047a71761ada86cebd7e44453fb82ff6010c9d494fe514d28" Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.201964 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk" Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.270159 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd"] Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.280815 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399700-qknbd"] Nov 24 11:45:04 crc kubenswrapper[4752]: I1124 11:45:04.739799 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f619c2d4-bcf5-4403-acd6-0bf90e2ece94" path="/var/lib/kubelet/pods/f619c2d4-bcf5-4403-acd6-0bf90e2ece94/volumes" Nov 24 11:45:13 crc kubenswrapper[4752]: I1124 11:45:13.728524 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:45:13 crc kubenswrapper[4752]: E1124 11:45:13.729823 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:45:28 crc kubenswrapper[4752]: I1124 11:45:28.727642 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:45:28 crc kubenswrapper[4752]: E1124 11:45:28.728399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:45:39 crc kubenswrapper[4752]: I1124 11:45:39.727842 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 
11:45:39 crc kubenswrapper[4752]: E1124 11:45:39.728650 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:45:40 crc kubenswrapper[4752]: I1124 11:45:40.809796 4752 scope.go:117] "RemoveContainer" containerID="723e98c50dd31d9365836ef6b053573db97c3c9d825af715ecd2ce5d83664ba8" Nov 24 11:45:51 crc kubenswrapper[4752]: I1124 11:45:51.728449 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:45:51 crc kubenswrapper[4752]: E1124 11:45:51.729308 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:46:03 crc kubenswrapper[4752]: I1124 11:46:03.728470 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:46:03 crc kubenswrapper[4752]: E1124 11:46:03.729328 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:46:16 crc kubenswrapper[4752]: I1124 11:46:16.728215 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:46:16 crc kubenswrapper[4752]: E1124 11:46:16.729015 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:46:29 crc kubenswrapper[4752]: I1124 11:46:29.728056 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:46:29 crc kubenswrapper[4752]: E1124 11:46:29.729221 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:46:42 crc kubenswrapper[4752]: I1124 11:46:42.728148 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:46:42 crc 
kubenswrapper[4752]: E1124 11:46:42.729204 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:46:54 crc kubenswrapper[4752]: I1124 11:46:54.737321 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:46:54 crc kubenswrapper[4752]: E1124 11:46:54.738923 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:47:08 crc kubenswrapper[4752]: I1124 11:47:08.727711 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:47:08 crc kubenswrapper[4752]: E1124 11:47:08.729418 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:47:19 crc kubenswrapper[4752]: I1124 11:47:19.728154 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:47:19 crc kubenswrapper[4752]: E1124 11:47:19.728796 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:47:31 crc kubenswrapper[4752]: I1124 11:47:31.727924 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:47:31 crc kubenswrapper[4752]: E1124 11:47:31.728510 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:47:42 crc kubenswrapper[4752]: I1124 11:47:42.728262 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:47:42 crc kubenswrapper[4752]: E1124 11:47:42.728988 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:47:57 crc kubenswrapper[4752]: I1124 11:47:57.728565 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:47:58 crc kubenswrapper[4752]: I1124 11:47:58.635429 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f"} Nov 24 11:50:15 crc kubenswrapper[4752]: I1124 11:50:15.468813 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:50:15 crc kubenswrapper[4752]: I1124 11:50:15.469673 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:50:45 crc kubenswrapper[4752]: I1124 11:50:45.468970 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:50:45 crc kubenswrapper[4752]: I1124 11:50:45.471248 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.641893 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"] Nov 24 11:50:58 crc kubenswrapper[4752]: E1124 11:50:58.644208 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b232e27-0a9b-4d35-9116-60a26c2deb80" containerName="collect-profiles" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.644242 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b232e27-0a9b-4d35-9116-60a26c2deb80" containerName="collect-profiles" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.645231 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b232e27-0a9b-4d35-9116-60a26c2deb80" containerName="collect-profiles" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.650613 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.692619 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"] Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.777143 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.777386 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.777404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv92w\" (UniqueName: \"kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.878480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.878527 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.878553 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv92w\" (UniqueName: \"kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.879114 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.879187 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.899552 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qv92w\" (UniqueName: \"kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w\") pod \"redhat-operators-4qjnj\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") " pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:58 crc kubenswrapper[4752]: I1124 11:50:58.996356 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:50:59 crc kubenswrapper[4752]: I1124 11:50:59.420823 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"] Nov 24 11:51:00 crc kubenswrapper[4752]: I1124 11:51:00.042878 4752 generic.go:334] "Generic (PLEG): container finished" podID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerID="a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d" exitCode=0 Nov 24 11:51:00 crc kubenswrapper[4752]: I1124 11:51:00.043007 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerDied","Data":"a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d"} Nov 24 11:51:00 crc kubenswrapper[4752]: I1124 11:51:00.043252 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerStarted","Data":"5683aa0ee90b76d11e1b67c1eaa9ffc6fcc61082560c6d4131a22b6dbe3908ae"} Nov 24 11:51:00 crc kubenswrapper[4752]: I1124 11:51:00.045345 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 11:51:07 crc kubenswrapper[4752]: I1124 11:51:07.101606 4752 generic.go:334] "Generic (PLEG): container finished" podID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerID="ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595" exitCode=0 Nov 24 11:51:07 crc kubenswrapper[4752]: I1124 11:51:07.101775 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerDied","Data":"ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595"} Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.116972 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerStarted","Data":"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"} Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.143452 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qjnj" podStartSLOduration=2.692851959 podStartE2EDuration="10.143433993s" podCreationTimestamp="2025-11-24 11:50:58 +0000 UTC" firstStartedPulling="2025-11-24 11:51:00.045124692 +0000 UTC m=+2666.029944981" lastFinishedPulling="2025-11-24 11:51:07.495706696 +0000 UTC m=+2673.480527015" observedRunningTime="2025-11-24 11:51:08.137422029 +0000 UTC m=+2674.122242328" watchObservedRunningTime="2025-11-24 11:51:08.143433993 +0000 UTC m=+2674.128254292" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.538763 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.540502 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.551554 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.680013 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8k9\" (UniqueName: \"kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.680070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.680098 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.782070 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8k9\" (UniqueName: \"kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.782139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.782169 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.782781 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.782801 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.800819 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pq8k9\" (UniqueName: \"kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9\") pod \"redhat-marketplace-w64vl\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:08 crc kubenswrapper[4752]: I1124 11:51:08.857525 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:09 crc kubenswrapper[4752]: I1124 11:51:09.001006 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:51:09 crc kubenswrapper[4752]: I1124 11:51:09.001389 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:51:09 crc kubenswrapper[4752]: I1124 11:51:09.263306 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:10 crc kubenswrapper[4752]: I1124 11:51:10.074226 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qjnj" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="registry-server" probeResult="failure" output=< Nov 24 11:51:10 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 11:51:10 crc kubenswrapper[4752]: > Nov 24 11:51:10 crc kubenswrapper[4752]: I1124 11:51:10.135171 4752 generic.go:334] "Generic (PLEG): container finished" podID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerID="f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405" exitCode=0 Nov 24 11:51:10 crc kubenswrapper[4752]: I1124 11:51:10.135277 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerDied","Data":"f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405"} Nov 24 11:51:10 crc kubenswrapper[4752]: I1124 11:51:10.135335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerStarted","Data":"7ae0e37b7abc932ecc067cf3a880ddf273ca8f9b39449ac1048e55bb228df807"} Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.752383 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.755677 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.763682 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.858400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvsp\" (UniqueName: \"kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.858494 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.858571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.960184 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvsp\" (UniqueName: \"kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.960270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.960321 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.960946 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.960952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:11 crc kubenswrapper[4752]: I1124 11:51:11.989260 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nbvsp\" (UniqueName: \"kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp\") pod \"community-operators-kwq7s\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:12 crc kubenswrapper[4752]: I1124 11:51:12.073700 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:12 crc kubenswrapper[4752]: I1124 11:51:12.167169 4752 generic.go:334] "Generic (PLEG): container finished" podID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerID="c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898" exitCode=0 Nov 24 11:51:12 crc kubenswrapper[4752]: I1124 11:51:12.167253 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerDied","Data":"c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898"} Nov 24 11:51:12 crc kubenswrapper[4752]: I1124 11:51:12.573565 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:13 crc kubenswrapper[4752]: I1124 11:51:13.178583 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerStarted","Data":"29ef47b5f3158d97719b251b3210cd587a567587441e1cc1fd5056e7cc99ca9c"} Nov 24 11:51:14 crc kubenswrapper[4752]: I1124 11:51:14.193123 4752 generic.go:334] "Generic (PLEG): container finished" podID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerID="8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7" exitCode=0 Nov 24 11:51:14 crc kubenswrapper[4752]: I1124 11:51:14.193394 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerDied","Data":"8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7"} Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.203413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerStarted","Data":"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22"} Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.238334 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w64vl" podStartSLOduration=2.99380972 podStartE2EDuration="7.238315602s" podCreationTimestamp="2025-11-24 11:51:08 +0000 UTC" firstStartedPulling="2025-11-24 11:51:10.138557136 +0000 UTC m=+2676.123377455" lastFinishedPulling="2025-11-24 11:51:14.383063038 +0000 UTC m=+2680.367883337" observedRunningTime="2025-11-24 11:51:15.235998725 +0000 UTC m=+2681.220819014" watchObservedRunningTime="2025-11-24 11:51:15.238315602 +0000 UTC m=+2681.223135891" Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.468945 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.469010 4752 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.469057 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.469647 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:51:15 crc kubenswrapper[4752]: I1124 11:51:15.469712 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f" gracePeriod=600 Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.211960 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f" exitCode=0 Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.212108 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f"} Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.212268 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"} Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.212289 4752 scope.go:117] "RemoveContainer" containerID="14f4d00d9a59d31eb6e46cebe36677e01200347bee214a5be288ba8301a5f228" Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.215053 4752 generic.go:334] "Generic (PLEG): container finished" podID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerID="c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c" exitCode=0 Nov 24 11:51:16 crc kubenswrapper[4752]: I1124 11:51:16.215136 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerDied","Data":"c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c"} Nov 24 11:51:17 crc kubenswrapper[4752]: I1124 11:51:17.225376 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerStarted","Data":"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8"} Nov 24 11:51:17 crc kubenswrapper[4752]: I1124 11:51:17.260496 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kwq7s" 
podStartSLOduration=4.780082616 podStartE2EDuration="6.260474704s" podCreationTimestamp="2025-11-24 11:51:11 +0000 UTC" firstStartedPulling="2025-11-24 11:51:15.205124556 +0000 UTC m=+2681.189944845" lastFinishedPulling="2025-11-24 11:51:16.685516614 +0000 UTC m=+2682.670336933" observedRunningTime="2025-11-24 11:51:17.248361745 +0000 UTC m=+2683.233182064" watchObservedRunningTime="2025-11-24 11:51:17.260474704 +0000 UTC m=+2683.245295003" Nov 24 11:51:18 crc kubenswrapper[4752]: I1124 11:51:18.858079 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:18 crc kubenswrapper[4752]: I1124 11:51:18.858188 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:18 crc kubenswrapper[4752]: I1124 11:51:18.905903 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:19 crc kubenswrapper[4752]: I1124 11:51:19.041507 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:51:19 crc kubenswrapper[4752]: I1124 11:51:19.093632 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qjnj" Nov 24 11:51:19 crc kubenswrapper[4752]: I1124 11:51:19.291886 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:20 crc kubenswrapper[4752]: I1124 11:51:20.956384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"] Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.134651 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.262470 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w64vl" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="registry-server" containerID="cri-o://15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22" gracePeriod=2 Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.338802 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.339107 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8p8mr" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="registry-server" containerID="cri-o://97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f" gracePeriod=2 Nov 24 11:51:21 crc kubenswrapper[4752]: E1124 11:51:21.436429 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fda2e76_e62a_41e7_ad15_6aeda79682a3.slice/crio-15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22.scope\": RecentStats: unable to find data in memory cache]" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.788341 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.795209 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content\") pod \"00198a03-44e5-42de-93b2-667fd5981ac4\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949482 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9wv\" (UniqueName: \"kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv\") pod \"00198a03-44e5-42de-93b2-667fd5981ac4\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949518 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities\") pod \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949549 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities\") pod \"00198a03-44e5-42de-93b2-667fd5981ac4\" (UID: \"00198a03-44e5-42de-93b2-667fd5981ac4\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949624 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content\") pod \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.949697 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8k9\" (UniqueName: \"kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9\") pod \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\" (UID: \"3fda2e76-e62a-41e7-ad15-6aeda79682a3\") " Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.950077 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities" (OuterVolumeSpecName: "utilities") pod "00198a03-44e5-42de-93b2-667fd5981ac4" (UID: "00198a03-44e5-42de-93b2-667fd5981ac4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.950368 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities" (OuterVolumeSpecName: "utilities") pod "3fda2e76-e62a-41e7-ad15-6aeda79682a3" (UID: "3fda2e76-e62a-41e7-ad15-6aeda79682a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.955049 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9" (OuterVolumeSpecName: "kube-api-access-pq8k9") pod "3fda2e76-e62a-41e7-ad15-6aeda79682a3" (UID: "3fda2e76-e62a-41e7-ad15-6aeda79682a3"). InnerVolumeSpecName "kube-api-access-pq8k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.955318 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv" (OuterVolumeSpecName: "kube-api-access-9p9wv") pod "00198a03-44e5-42de-93b2-667fd5981ac4" (UID: "00198a03-44e5-42de-93b2-667fd5981ac4"). InnerVolumeSpecName "kube-api-access-9p9wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.956328 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq8k9\" (UniqueName: \"kubernetes.io/projected/3fda2e76-e62a-41e7-ad15-6aeda79682a3-kube-api-access-pq8k9\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.956359 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9wv\" (UniqueName: \"kubernetes.io/projected/00198a03-44e5-42de-93b2-667fd5981ac4-kube-api-access-9p9wv\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.956369 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.956379 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:21 crc kubenswrapper[4752]: I1124 11:51:21.970266 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fda2e76-e62a-41e7-ad15-6aeda79682a3" (UID: "3fda2e76-e62a-41e7-ad15-6aeda79682a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.035903 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00198a03-44e5-42de-93b2-667fd5981ac4" (UID: "00198a03-44e5-42de-93b2-667fd5981ac4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.057958 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda2e76-e62a-41e7-ad15-6aeda79682a3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.057988 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00198a03-44e5-42de-93b2-667fd5981ac4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.074480 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.074522 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.123399 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.270893 4752 generic.go:334] "Generic (PLEG): container finished" podID="00198a03-44e5-42de-93b2-667fd5981ac4" containerID="97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f" exitCode=0 Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.270965 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p8mr" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.270976 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerDied","Data":"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f"} Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.271009 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p8mr" event={"ID":"00198a03-44e5-42de-93b2-667fd5981ac4","Type":"ContainerDied","Data":"1bd87c35b8a7eb415b791e67efc865d992251c212e9e53cb030416cd67aefb90"} Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.271028 4752 scope.go:117] "RemoveContainer" containerID="97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.277700 4752 generic.go:334] "Generic (PLEG): container finished" podID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerID="15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22" exitCode=0 Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.277778 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerDied","Data":"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22"} Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.277816 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w64vl" event={"ID":"3fda2e76-e62a-41e7-ad15-6aeda79682a3","Type":"ContainerDied","Data":"7ae0e37b7abc932ecc067cf3a880ddf273ca8f9b39449ac1048e55bb228df807"} Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.277839 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w64vl" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.305323 4752 scope.go:117] "RemoveContainer" containerID="9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.314008 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.323529 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8p8mr"] Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.329098 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.333707 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w64vl"] Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.335266 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.342019 4752 scope.go:117] "RemoveContainer" containerID="c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.363308 4752 scope.go:117] "RemoveContainer" containerID="97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.364493 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f\": container with ID starting with 97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f not found: ID does not exist" containerID="97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.364525 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f"} err="failed to get container status \"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f\": rpc error: code = NotFound desc = could not find container \"97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f\": container with ID starting with 97c26f8d5d9c8781808611e01aaac98583fd1f48642c21a9157782d27b5c318f not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.364548 4752 scope.go:117] "RemoveContainer" containerID="9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.364915 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b\": container with ID starting with 9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b not found: ID does not exist" containerID="9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.364945 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b"} err="failed to get container status \"9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b\": rpc error: code = 
NotFound desc = could not find container \"9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b\": container with ID starting with 9a3933f5ff6375ad55062bd77429e34547688773c9981fa98766262befc6113b not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.364959 4752 scope.go:117] "RemoveContainer" containerID="c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.365221 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3\": container with ID starting with c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3 not found: ID does not exist" containerID="c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.365243 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3"} err="failed to get container status \"c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3\": rpc error: code = NotFound desc = could not find container \"c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3\": container with ID starting with c8f76c44e041024053327ad7ca10d04f095263d89cbfd480a2b81bd3744cfad3 not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.365258 4752 scope.go:117] "RemoveContainer" containerID="15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.384071 4752 scope.go:117] "RemoveContainer" containerID="c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.402489 4752 scope.go:117] "RemoveContainer" containerID="f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.448980 4752 scope.go:117] "RemoveContainer" containerID="15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.449430 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22\": container with ID starting with 15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22 not found: ID does not exist" containerID="15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.449469 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22"} err="failed to get container status \"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22\": rpc error: code = NotFound desc = could not find container \"15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22\": container with ID starting with 15e9ebd6ece2f20db140ca213b57074882339aa513aa118e05377438e7fbaf22 not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.449495 4752 scope.go:117] "RemoveContainer" containerID="c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.449837 4752 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898\": container with ID starting with c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898 not found: ID does not exist" containerID="c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.449862 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898"} err="failed to get container status \"c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898\": rpc error: code = NotFound desc = could not find container \"c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898\": container with ID starting with c934a71a3fcd9e449f2f08b9248ce907d8759d82ce7855746aa0dd4ee93b7898 not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.449879 4752 scope.go:117] "RemoveContainer" containerID="f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405" Nov 24 11:51:22 crc kubenswrapper[4752]: E1124 11:51:22.450116 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405\": container with ID starting with f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405 not found: ID does not exist" containerID="f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.450141 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405"} err="failed to get container status \"f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405\": rpc error: code = NotFound desc = could not find container \"f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405\": container with ID starting with f740153e6fb55820a68fd68332f21a365e0e92294bffafbed5b3fca2df0ca405 not found: ID does not exist" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.736806 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" path="/var/lib/kubelet/pods/00198a03-44e5-42de-93b2-667fd5981ac4/volumes" Nov 24 11:51:22 crc kubenswrapper[4752]: I1124 11:51:22.737427 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" path="/var/lib/kubelet/pods/3fda2e76-e62a-41e7-ad15-6aeda79682a3/volumes" Nov 24 11:51:25 crc kubenswrapper[4752]: I1124 11:51:25.545820 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:25 crc kubenswrapper[4752]: I1124 11:51:25.548448 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kwq7s" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="registry-server" containerID="cri-o://3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8" gracePeriod=2 Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.030534 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.113131 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbvsp\" (UniqueName: \"kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp\") pod \"c52481e1-5ec3-47c6-a665-4e56317dd16e\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.113230 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities\") pod \"c52481e1-5ec3-47c6-a665-4e56317dd16e\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.113328 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content\") pod \"c52481e1-5ec3-47c6-a665-4e56317dd16e\" (UID: \"c52481e1-5ec3-47c6-a665-4e56317dd16e\") " Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.114201 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities" (OuterVolumeSpecName: "utilities") pod "c52481e1-5ec3-47c6-a665-4e56317dd16e" (UID: "c52481e1-5ec3-47c6-a665-4e56317dd16e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.120850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp" (OuterVolumeSpecName: "kube-api-access-nbvsp") pod "c52481e1-5ec3-47c6-a665-4e56317dd16e" (UID: "c52481e1-5ec3-47c6-a665-4e56317dd16e"). InnerVolumeSpecName "kube-api-access-nbvsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.183751 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c52481e1-5ec3-47c6-a665-4e56317dd16e" (UID: "c52481e1-5ec3-47c6-a665-4e56317dd16e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.215264 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbvsp\" (UniqueName: \"kubernetes.io/projected/c52481e1-5ec3-47c6-a665-4e56317dd16e-kube-api-access-nbvsp\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.215486 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.215585 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52481e1-5ec3-47c6-a665-4e56317dd16e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.325245 4752 generic.go:334] "Generic (PLEG): container finished" podID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerID="3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8" exitCode=0 Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.325291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerDied","Data":"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8"} Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.325329 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwq7s" event={"ID":"c52481e1-5ec3-47c6-a665-4e56317dd16e","Type":"ContainerDied","Data":"29ef47b5f3158d97719b251b3210cd587a567587441e1cc1fd5056e7cc99ca9c"} Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.325350 4752 scope.go:117] "RemoveContainer" containerID="3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.325390 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kwq7s" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.348321 4752 scope.go:117] "RemoveContainer" containerID="c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.371113 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.376395 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kwq7s"] Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.381985 4752 scope.go:117] "RemoveContainer" containerID="8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.401494 4752 scope.go:117] "RemoveContainer" containerID="3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8" Nov 24 11:51:26 crc kubenswrapper[4752]: E1124 11:51:26.401934 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8\": container with ID starting with 3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8 not found: ID does not exist" containerID="3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.401985 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8"} err="failed to get container status \"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8\": rpc error: code = NotFound desc = could not find container \"3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8\": container with ID starting with 3059082ee49e93e8b9afd9363ce25250207520ee1a264b08bed5e26a71142ab8 not found: ID does not exist" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.402019 4752 scope.go:117] "RemoveContainer" containerID="c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c" Nov 24 11:51:26 crc kubenswrapper[4752]: E1124 11:51:26.402389 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c\": container with ID starting with c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c not found: ID does not exist" containerID="c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.402429 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c"} err="failed to get container status \"c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c\": rpc error: code = NotFound desc = could not find container \"c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c\": container with ID starting with c7b25baad878e227ac8d15f2787cad2a653b1e450e01f6a1f07c7353075d9e0c not found: ID does not exist" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.402460 4752 scope.go:117] "RemoveContainer" containerID="8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7" Nov 24 11:51:26 crc kubenswrapper[4752]: E1124 11:51:26.402777 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7\": container with ID starting with 8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7 not found: ID does not exist" containerID="8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.402811 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7"} err="failed to get container status \"8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7\": rpc error: code = NotFound desc = could not find container \"8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7\": container with ID starting with 8e39d43cdef151749754a36d1e9360383debda20bc5c5f61c409c0e3da50cde7 not found: ID does not exist" Nov 24 11:51:26 crc kubenswrapper[4752]: I1124 11:51:26.738205 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" path="/var/lib/kubelet/pods/c52481e1-5ec3-47c6-a665-4e56317dd16e/volumes" Nov 24 11:53:15 crc kubenswrapper[4752]: I1124 11:53:15.468490 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:53:15 crc kubenswrapper[4752]: I1124 11:53:15.469111 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.508228 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509113 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509125 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509138 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509145 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509161 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509168 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509184 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 
11:53:32.509190 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509202 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509208 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509228 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509235 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509245 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509251 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="extract-content" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509258 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509264 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="extract-utilities" Nov 24 11:53:32 crc kubenswrapper[4752]: E1124 11:53:32.509274 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509280 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509412 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="00198a03-44e5-42de-93b2-667fd5981ac4" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509425 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fda2e76-e62a-41e7-ad15-6aeda79682a3" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.509436 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52481e1-5ec3-47c6-a665-4e56317dd16e" containerName="registry-server" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.510434 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.520580 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.564839 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.565275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r89hf\" (UniqueName: \"kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.565333 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.667698 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.667819 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r89hf\" (UniqueName: \"kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.667873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.668348 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.668501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.689042 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r89hf\" (UniqueName: \"kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf\") pod \"certified-operators-6szbf\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:32 crc kubenswrapper[4752]: I1124 11:53:32.851435 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:33 crc kubenswrapper[4752]: I1124 11:53:33.162437 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:33 crc kubenswrapper[4752]: I1124 11:53:33.389520 4752 generic.go:334] "Generic (PLEG): container finished" podID="5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" containerID="37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a" exitCode=0 Nov 24 11:53:33 crc kubenswrapper[4752]: I1124 11:53:33.389561 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerDied","Data":"37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a"} Nov 24 11:53:33 crc kubenswrapper[4752]: I1124 11:53:33.389585 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerStarted","Data":"a3c1cc31959c2d15f4fd83ccf9c7e37c5cf925d7c4ee1adef43f97c8d8d34c97"} Nov 24 11:53:34 crc kubenswrapper[4752]: I1124 11:53:34.400020 4752 generic.go:334] "Generic (PLEG): container finished" podID="5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" containerID="9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff" exitCode=0 Nov 24 11:53:34 crc kubenswrapper[4752]: I1124 11:53:34.400104 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerDied","Data":"9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff"} Nov 24 11:53:35 crc kubenswrapper[4752]: I1124 11:53:35.412877 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerStarted","Data":"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df"} Nov 24 11:53:35 crc kubenswrapper[4752]: I1124 11:53:35.429576 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6szbf" podStartSLOduration=1.99108248 podStartE2EDuration="3.429555932s" podCreationTimestamp="2025-11-24 11:53:32 +0000 UTC" firstStartedPulling="2025-11-24 11:53:33.390733939 +0000 UTC m=+2819.375554228" lastFinishedPulling="2025-11-24 11:53:34.829207391 +0000 UTC m=+2820.814027680" observedRunningTime="2025-11-24 11:53:35.426692969 +0000 UTC m=+2821.411513258" watchObservedRunningTime="2025-11-24 11:53:35.429555932 +0000 UTC m=+2821.414376221" Nov 24 11:53:42 crc kubenswrapper[4752]: I1124 11:53:42.851685 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:42 crc kubenswrapper[4752]: I1124 11:53:42.853035 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:42 crc kubenswrapper[4752]: I1124 11:53:42.929341 4752 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:43 crc kubenswrapper[4752]: I1124 11:53:43.537525 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:43 crc kubenswrapper[4752]: I1124 11:53:43.585690 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.469787 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.470234 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.498574 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6szbf" podUID="5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" containerName="registry-server" containerID="cri-o://035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df" gracePeriod=2 Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.918944 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.965029 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content\") pod \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.965194 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r89hf\" (UniqueName: \"kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf\") pod \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.965323 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities\") pod \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\" (UID: \"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8\") " Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.970666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities" (OuterVolumeSpecName: "utilities") pod "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" (UID: "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:53:45 crc kubenswrapper[4752]: I1124 11:53:45.978132 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf" (OuterVolumeSpecName: "kube-api-access-r89hf") pod "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" (UID: "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8"). InnerVolumeSpecName "kube-api-access-r89hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.021432 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" (UID: "5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.069249 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r89hf\" (UniqueName: \"kubernetes.io/projected/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-kube-api-access-r89hf\") on node \"crc\" DevicePath \"\"" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.069296 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.069310 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.514531 4752 generic.go:334] "Generic (PLEG): container finished" podID="5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" containerID="035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df" exitCode=0 Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.514586 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerDied","Data":"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df"} Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.514630 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6szbf" event={"ID":"5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8","Type":"ContainerDied","Data":"a3c1cc31959c2d15f4fd83ccf9c7e37c5cf925d7c4ee1adef43f97c8d8d34c97"} Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.514653 4752 scope.go:117] "RemoveContainer" containerID="035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.514675 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6szbf" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.556273 4752 scope.go:117] "RemoveContainer" containerID="9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.558961 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.565637 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6szbf"] Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.579292 4752 scope.go:117] "RemoveContainer" containerID="37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.609168 4752 scope.go:117] "RemoveContainer" containerID="035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df" Nov 24 11:53:46 crc kubenswrapper[4752]: E1124 11:53:46.609843 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df\": container with ID starting with 035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df not found: ID does not exist" containerID="035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.609899 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df"} err="failed to get container status \"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df\": rpc error: code = NotFound desc = could not find container \"035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df\": container with ID starting with 035f6809882cd202db76818714e09fd2fe8a91e05fd768a1a368056fd38cc0df not found: ID does not exist" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.609923 4752 scope.go:117] "RemoveContainer" containerID="9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff" Nov 24 11:53:46 crc kubenswrapper[4752]: E1124 11:53:46.610328 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff\": container with ID starting with 9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff not found: ID does not exist" containerID="9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.610366 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff"} err="failed to get container status \"9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff\": rpc error: code = NotFound desc = could not find container \"9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff\": container with ID starting with 9e36957d084f0beb5e93013824736782348f40da5afcdfa872b2ad9efe35c8ff not found: ID does not exist" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.610396 4752 scope.go:117] "RemoveContainer" containerID="37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a" Nov 24 11:53:46 crc kubenswrapper[4752]: E1124 11:53:46.610728 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a\": container with ID starting with 37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a not found: ID does not exist" containerID="37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.610764 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a"} err="failed to get container status \"37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a\": rpc error: code = NotFound desc = could not find container \"37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a\": container with ID starting with 37b6a031b68cf27190c487368a63d60b43ff71b0f592c59b066178dbedf88b9a not found: ID does not exist" Nov 24 11:53:46 crc kubenswrapper[4752]: I1124 11:53:46.740075 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8" path="/var/lib/kubelet/pods/5148ff02-9b8e-4fd1-a4ef-5ee1ec25a2e8/volumes" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.469112 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.470873 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.471020 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.471764 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.472126 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" gracePeriod=600 Nov 24 11:54:15 crc kubenswrapper[4752]: E1124 11:54:15.615048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.739839 4752 generic.go:334] 
"Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" exitCode=0 Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.739894 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"} Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.739950 4752 scope.go:117] "RemoveContainer" containerID="b0e9434178b8e02af148efc8e61ca189701f5abf09e3a5a0ab354d292050140f" Nov 24 11:54:15 crc kubenswrapper[4752]: I1124 11:54:15.740564 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" Nov 24 11:54:15 crc kubenswrapper[4752]: E1124 11:54:15.740884 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:54:30 crc kubenswrapper[4752]: I1124 11:54:30.727562 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" Nov 24 11:54:30 crc kubenswrapper[4752]: E1124 11:54:30.728436 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:54:43 crc kubenswrapper[4752]: I1124 11:54:43.727834 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" Nov 24 11:54:43 crc kubenswrapper[4752]: E1124 11:54:43.728769 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:54:58 crc kubenswrapper[4752]: I1124 11:54:58.728999 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" Nov 24 11:54:58 crc kubenswrapper[4752]: E1124 11:54:58.729626 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 11:55:12 crc kubenswrapper[4752]: I1124 11:55:12.729177 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" 
Nov 24 11:55:12 crc kubenswrapper[4752]: E1124 11:55:12.731175 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:55:26 crc kubenswrapper[4752]: I1124 11:55:26.729089 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:55:26 crc kubenswrapper[4752]: E1124 11:55:26.730552 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:55:41 crc kubenswrapper[4752]: I1124 11:55:41.729011 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:55:41 crc kubenswrapper[4752]: E1124 11:55:41.730056 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:55:56 crc kubenswrapper[4752]: I1124 11:55:56.728606 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:55:56 crc kubenswrapper[4752]: E1124 11:55:56.729337 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:56:11 crc kubenswrapper[4752]: I1124 11:56:11.728631 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:56:11 crc kubenswrapper[4752]: E1124 11:56:11.729679 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:56:24 crc kubenswrapper[4752]: I1124 11:56:24.734939 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:56:24 crc kubenswrapper[4752]: E1124 11:56:24.736152 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:56:35 crc kubenswrapper[4752]: I1124 11:56:35.727998 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:56:35 crc kubenswrapper[4752]: E1124 11:56:35.728861 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:56:49 crc kubenswrapper[4752]: I1124 11:56:49.728225 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:56:49 crc kubenswrapper[4752]: E1124 11:56:49.729083 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:57:02 crc kubenswrapper[4752]: I1124 11:57:02.728577 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:57:02 crc kubenswrapper[4752]: E1124 11:57:02.729541 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:57:14 crc kubenswrapper[4752]: I1124 11:57:14.732118 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:57:14 crc kubenswrapper[4752]: E1124 11:57:14.733395 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:57:25 crc kubenswrapper[4752]: I1124 11:57:25.728100 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:57:25 crc kubenswrapper[4752]: E1124 11:57:25.729048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:57:40 crc kubenswrapper[4752]: I1124 11:57:40.728575 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:57:40 crc kubenswrapper[4752]: E1124 11:57:40.729595 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:57:53 crc kubenswrapper[4752]: I1124 11:57:53.728619 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:57:53 crc kubenswrapper[4752]: E1124 11:57:53.729721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:58:07 crc kubenswrapper[4752]: I1124 11:58:07.727857 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:58:07 crc kubenswrapper[4752]: E1124 11:58:07.729060 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:58:22 crc kubenswrapper[4752]: I1124 11:58:22.728486 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:58:22 crc kubenswrapper[4752]: E1124 11:58:22.730444 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:58:36 crc kubenswrapper[4752]: I1124 11:58:36.728760 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:58:36 crc kubenswrapper[4752]: E1124 11:58:36.729507 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:58:50 crc kubenswrapper[4752]: I1124 11:58:50.728895 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:58:50 crc kubenswrapper[4752]: E1124 11:58:50.729780 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:59:02 crc kubenswrapper[4752]: I1124 11:59:02.727649 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:59:02 crc kubenswrapper[4752]: E1124 11:59:02.728348 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 11:59:17 crc kubenswrapper[4752]: I1124 11:59:17.728173 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5"
Nov 24 11:59:18 crc kubenswrapper[4752]: I1124 11:59:18.382920 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45"}
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.213990 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.214093 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.215279 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj"] Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.318790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz6s4\" (UniqueName: \"kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.319531 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.319610 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.420414 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6s4\" (UniqueName: \"kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.420695 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.420838 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.422020 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume\") pod 
\"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.429703 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.441441 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6s4\" (UniqueName: \"kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4\") pod \"collect-profiles-29399760-k69tj\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.571606 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:00 crc kubenswrapper[4752]: I1124 12:00:00.989109 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj"] Nov 24 12:00:01 crc kubenswrapper[4752]: E1124 12:00:01.531929 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2b95fb_899f_40a8_9282_799f7fa37597.slice/crio-conmon-b5af52724f270e0e2ea22302d1dfb909bea70b499eb59dd08546f1bdbc7951b4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2b95fb_899f_40a8_9282_799f7fa37597.slice/crio-b5af52724f270e0e2ea22302d1dfb909bea70b499eb59dd08546f1bdbc7951b4.scope\": RecentStats: unable to find data in memory cache]" Nov 24 12:00:01 crc kubenswrapper[4752]: I1124 12:00:01.769278 4752 generic.go:334] "Generic (PLEG): container finished" podID="fb2b95fb-899f-40a8-9282-799f7fa37597" containerID="b5af52724f270e0e2ea22302d1dfb909bea70b499eb59dd08546f1bdbc7951b4" exitCode=0 Nov 24 12:00:01 crc kubenswrapper[4752]: I1124 12:00:01.769364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" event={"ID":"fb2b95fb-899f-40a8-9282-799f7fa37597","Type":"ContainerDied","Data":"b5af52724f270e0e2ea22302d1dfb909bea70b499eb59dd08546f1bdbc7951b4"} Nov 24 12:00:01 crc kubenswrapper[4752]: I1124 12:00:01.769829 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" event={"ID":"fb2b95fb-899f-40a8-9282-799f7fa37597","Type":"ContainerStarted","Data":"e78ba1104da4e35246cf3f536ca818e3ec0086cb3bf99040875e49b4c8388f5a"} Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.041926 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.161514 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume\") pod \"fb2b95fb-899f-40a8-9282-799f7fa37597\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.161586 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume\") pod \"fb2b95fb-899f-40a8-9282-799f7fa37597\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.161619 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz6s4\" (UniqueName: \"kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4\") pod \"fb2b95fb-899f-40a8-9282-799f7fa37597\" (UID: \"fb2b95fb-899f-40a8-9282-799f7fa37597\") " Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.163939 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb2b95fb-899f-40a8-9282-799f7fa37597" (UID: "fb2b95fb-899f-40a8-9282-799f7fa37597"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.167161 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb2b95fb-899f-40a8-9282-799f7fa37597" (UID: "fb2b95fb-899f-40a8-9282-799f7fa37597"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.169492 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4" (OuterVolumeSpecName: "kube-api-access-dz6s4") pod "fb2b95fb-899f-40a8-9282-799f7fa37597" (UID: "fb2b95fb-899f-40a8-9282-799f7fa37597"). InnerVolumeSpecName "kube-api-access-dz6s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.263080 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb2b95fb-899f-40a8-9282-799f7fa37597-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.263112 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb2b95fb-899f-40a8-9282-799f7fa37597-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.263124 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz6s4\" (UniqueName: \"kubernetes.io/projected/fb2b95fb-899f-40a8-9282-799f7fa37597-kube-api-access-dz6s4\") on node \"crc\" DevicePath \"\"" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.810259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" event={"ID":"fb2b95fb-899f-40a8-9282-799f7fa37597","Type":"ContainerDied","Data":"e78ba1104da4e35246cf3f536ca818e3ec0086cb3bf99040875e49b4c8388f5a"} Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.810305 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e78ba1104da4e35246cf3f536ca818e3ec0086cb3bf99040875e49b4c8388f5a" Nov 24 12:00:03 crc kubenswrapper[4752]: I1124 12:00:03.810569 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj" Nov 24 12:00:04 crc kubenswrapper[4752]: I1124 12:00:04.109539 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt"] Nov 24 12:00:04 crc kubenswrapper[4752]: I1124 12:00:04.113804 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399715-jd4wt"] Nov 24 12:00:04 crc kubenswrapper[4752]: I1124 12:00:04.736664 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881eb454-d8b6-4b23-a23b-c3e0fc44d97c" path="/var/lib/kubelet/pods/881eb454-d8b6-4b23-a23b-c3e0fc44d97c/volumes" Nov 24 12:00:41 crc kubenswrapper[4752]: I1124 12:00:41.148338 4752 scope.go:117] "RemoveContainer" containerID="1b75735ac8682a5ee99c592c9ca633b197599670c14e997d46c504c8dcd42504" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.705504 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:12 crc kubenswrapper[4752]: E1124 12:01:12.706342 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2b95fb-899f-40a8-9282-799f7fa37597" containerName="collect-profiles" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.706356 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2b95fb-899f-40a8-9282-799f7fa37597" containerName="collect-profiles" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.706542 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2b95fb-899f-40a8-9282-799f7fa37597" containerName="collect-profiles" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.707815 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.719113 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.805520 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.805597 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vprcz\" (UniqueName: \"kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.805652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.906515 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.906597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vprcz\" (UniqueName: \"kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.906657 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.906942 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.907007 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:12 crc kubenswrapper[4752]: I1124 12:01:12.926860 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vprcz\" (UniqueName: \"kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz\") pod \"redhat-operators-8zhdm\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:13 crc kubenswrapper[4752]: I1124 12:01:13.044817 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:13 crc kubenswrapper[4752]: I1124 12:01:13.460143 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:14 crc kubenswrapper[4752]: I1124 12:01:14.409137 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a027cdf-3bff-4215-a9d9-04447529d989" containerID="e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69" exitCode=0 Nov 24 12:01:14 crc kubenswrapper[4752]: I1124 12:01:14.409180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerDied","Data":"e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69"} Nov 24 12:01:14 crc kubenswrapper[4752]: I1124 12:01:14.409209 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerStarted","Data":"823ff374661cdb8293d8421f7693147b56e5ca756ba84d7d1968f2923e649c98"} Nov 24 12:01:14 crc kubenswrapper[4752]: I1124 12:01:14.410960 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:01:15 crc kubenswrapper[4752]: I1124 12:01:15.416997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerStarted","Data":"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59"} Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.427489 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a027cdf-3bff-4215-a9d9-04447529d989" containerID="bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59" exitCode=0 Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.427565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerDied","Data":"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59"} Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.492827 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.494949 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.508084 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.658592 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.658655 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccvv\" (UniqueName: \"kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.658687 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.760522 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.760608 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccvv\" (UniqueName: \"kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.760646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.761426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.761436 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.799684 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9ccvv\" (UniqueName: \"kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv\") pod \"community-operators-d6wjf\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:16 crc kubenswrapper[4752]: I1124 12:01:16.813631 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:17 crc kubenswrapper[4752]: I1124 12:01:17.388053 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:17 crc kubenswrapper[4752]: I1124 12:01:17.436812 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerStarted","Data":"81ad8a59f13779c74ce219ed43a8a7c08e9b2d3718d51d0bc001fde779b0a758"} Nov 24 12:01:17 crc kubenswrapper[4752]: I1124 12:01:17.439494 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerStarted","Data":"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379"} Nov 24 12:01:17 crc kubenswrapper[4752]: I1124 12:01:17.466908 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8zhdm" podStartSLOduration=3.028212366 podStartE2EDuration="5.466890729s" podCreationTimestamp="2025-11-24 12:01:12 +0000 UTC" firstStartedPulling="2025-11-24 12:01:14.410657114 +0000 UTC m=+3280.395477403" lastFinishedPulling="2025-11-24 12:01:16.849335477 +0000 UTC m=+3282.834155766" observedRunningTime="2025-11-24 12:01:17.459041494 +0000 UTC m=+3283.443861793" watchObservedRunningTime="2025-11-24 12:01:17.466890729 +0000 UTC m=+3283.451711018" Nov 24 12:01:18 crc kubenswrapper[4752]: I1124 12:01:18.449789 4752 generic.go:334] "Generic (PLEG): container finished" podID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerID="8dfd50bd472334a1ddbe359d078c193a631ea7a0486f1ac0d06fd4cbaecf75f0" exitCode=0 Nov 24 12:01:18 crc kubenswrapper[4752]: I1124 12:01:18.449864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerDied","Data":"8dfd50bd472334a1ddbe359d078c193a631ea7a0486f1ac0d06fd4cbaecf75f0"} Nov 24 12:01:19 crc kubenswrapper[4752]: I1124 12:01:19.464943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerStarted","Data":"ad48925e094388dc89e4ce8c8226c46dd8b3625e82e9cc9f7fe4d0aa7db57a92"} Nov 24 12:01:20 crc kubenswrapper[4752]: I1124 12:01:20.475139 4752 generic.go:334] "Generic (PLEG): container finished" podID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerID="ad48925e094388dc89e4ce8c8226c46dd8b3625e82e9cc9f7fe4d0aa7db57a92" exitCode=0 Nov 24 12:01:20 crc kubenswrapper[4752]: I1124 12:01:20.475204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerDied","Data":"ad48925e094388dc89e4ce8c8226c46dd8b3625e82e9cc9f7fe4d0aa7db57a92"} Nov 24 12:01:21 crc kubenswrapper[4752]: I1124 12:01:21.484128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerStarted","Data":"6fb439de130305b25ba73fc669c7c7af4b39e37e18cb868bf526c1967c320095"} Nov 24 12:01:21 crc kubenswrapper[4752]: I1124 12:01:21.499724 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d6wjf" podStartSLOduration=2.904456191 podStartE2EDuration="5.499714051s" podCreationTimestamp="2025-11-24 12:01:16 +0000 UTC" firstStartedPulling="2025-11-24 12:01:18.4527039 +0000 UTC m=+3284.437524229" lastFinishedPulling="2025-11-24 12:01:21.0479618 +0000 UTC m=+3287.032782089" observedRunningTime="2025-11-24 12:01:21.498308091 +0000 UTC m=+3287.483128380" watchObservedRunningTime="2025-11-24 12:01:21.499714051 +0000 UTC m=+3287.484534340" Nov 24 12:01:23 crc kubenswrapper[4752]: I1124 12:01:23.045117 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:23 crc kubenswrapper[4752]: I1124 12:01:23.045175 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:23 crc kubenswrapper[4752]: I1124 12:01:23.093306 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:23 crc kubenswrapper[4752]: I1124 12:01:23.548565 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:24 crc kubenswrapper[4752]: I1124 12:01:24.287624 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:25 crc kubenswrapper[4752]: I1124 12:01:25.518595 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8zhdm" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="registry-server" containerID="cri-o://ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379" gracePeriod=2 Nov 24 12:01:25 crc kubenswrapper[4752]: I1124 12:01:25.993279 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.097405 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content\") pod \"7a027cdf-3bff-4215-a9d9-04447529d989\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.097543 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vprcz\" (UniqueName: \"kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz\") pod \"7a027cdf-3bff-4215-a9d9-04447529d989\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.097571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities\") pod \"7a027cdf-3bff-4215-a9d9-04447529d989\" (UID: \"7a027cdf-3bff-4215-a9d9-04447529d989\") " Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.098534 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities" (OuterVolumeSpecName: "utilities") pod "7a027cdf-3bff-4215-a9d9-04447529d989" (UID: "7a027cdf-3bff-4215-a9d9-04447529d989"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.102650 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz" (OuterVolumeSpecName: "kube-api-access-vprcz") pod "7a027cdf-3bff-4215-a9d9-04447529d989" (UID: "7a027cdf-3bff-4215-a9d9-04447529d989"). InnerVolumeSpecName "kube-api-access-vprcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.199522 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vprcz\" (UniqueName: \"kubernetes.io/projected/7a027cdf-3bff-4215-a9d9-04447529d989-kube-api-access-vprcz\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.199558 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.200710 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a027cdf-3bff-4215-a9d9-04447529d989" (UID: "7a027cdf-3bff-4215-a9d9-04447529d989"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.300488 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a027cdf-3bff-4215-a9d9-04447529d989-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.530154 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a027cdf-3bff-4215-a9d9-04447529d989" containerID="ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379" exitCode=0 Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.530203 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zhdm" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.530220 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerDied","Data":"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379"} Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.531524 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zhdm" event={"ID":"7a027cdf-3bff-4215-a9d9-04447529d989","Type":"ContainerDied","Data":"823ff374661cdb8293d8421f7693147b56e5ca756ba84d7d1968f2923e649c98"} Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.531580 4752 scope.go:117] "RemoveContainer" containerID="ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.567382 4752 scope.go:117] "RemoveContainer" containerID="bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.569409 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.576271 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8zhdm"] Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.589490 4752 scope.go:117] "RemoveContainer" containerID="e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.620111 4752 scope.go:117] "RemoveContainer" containerID="ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379" Nov 24 12:01:26 crc kubenswrapper[4752]: E1124 12:01:26.620824 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379\": container with ID starting with ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379 not found: ID does not exist" containerID="ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.620871 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379"} err="failed to get container status \"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379\": rpc error: code = NotFound desc = could not find container \"ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379\": container with ID starting with ede4536505957169fb1a09eb446e49dd6dae7451f513cd3a8d7590c1cfcee379 not found: ID does not exist" Nov 24 12:01:26 crc 
kubenswrapper[4752]: I1124 12:01:26.620898 4752 scope.go:117] "RemoveContainer" containerID="bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59" Nov 24 12:01:26 crc kubenswrapper[4752]: E1124 12:01:26.621339 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59\": container with ID starting with bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59 not found: ID does not exist" containerID="bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.621544 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59"} err="failed to get container status \"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59\": rpc error: code = NotFound desc = could not find container \"bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59\": container with ID starting with bdd318579c11b57b273db5dc23aa6108db699c88e61c0facb5196fca0a08aa59 not found: ID does not exist" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.621889 4752 scope.go:117] "RemoveContainer" containerID="e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69" Nov 24 12:01:26 crc kubenswrapper[4752]: E1124 12:01:26.622571 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69\": container with ID starting with e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69 not found: ID does not exist" containerID="e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.622592 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69"} err="failed to get container status \"e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69\": rpc error: code = NotFound desc = could not find container \"e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69\": container with ID starting with e35529d843fcbb5e8dc172b6144134af6004866e4f15ef14046b730b6c018e69 not found: ID does not exist" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.737249 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" path="/var/lib/kubelet/pods/7a027cdf-3bff-4215-a9d9-04447529d989/volumes" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.814054 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.814155 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:26 crc kubenswrapper[4752]: I1124 12:01:26.865352 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:27 crc kubenswrapper[4752]: I1124 12:01:27.610896 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:29 crc kubenswrapper[4752]: I1124 12:01:29.286452 4752 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:29 crc kubenswrapper[4752]: I1124 12:01:29.572017 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d6wjf" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="registry-server" containerID="cri-o://6fb439de130305b25ba73fc669c7c7af4b39e37e18cb868bf526c1967c320095" gracePeriod=2 Nov 24 12:01:30 crc kubenswrapper[4752]: I1124 12:01:30.600513 4752 generic.go:334] "Generic (PLEG): container finished" podID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerID="6fb439de130305b25ba73fc669c7c7af4b39e37e18cb868bf526c1967c320095" exitCode=0 Nov 24 12:01:30 crc kubenswrapper[4752]: I1124 12:01:30.600599 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerDied","Data":"6fb439de130305b25ba73fc669c7c7af4b39e37e18cb868bf526c1967c320095"} Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.135892 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.295454 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content\") pod \"af891066-3d45-4398-a2f7-5f1ef681ee39\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.295595 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ccvv\" (UniqueName: \"kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv\") pod \"af891066-3d45-4398-a2f7-5f1ef681ee39\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.295634 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities\") pod \"af891066-3d45-4398-a2f7-5f1ef681ee39\" (UID: \"af891066-3d45-4398-a2f7-5f1ef681ee39\") " Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.296877 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities" (OuterVolumeSpecName: "utilities") pod "af891066-3d45-4398-a2f7-5f1ef681ee39" (UID: "af891066-3d45-4398-a2f7-5f1ef681ee39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.302412 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv" (OuterVolumeSpecName: "kube-api-access-9ccvv") pod "af891066-3d45-4398-a2f7-5f1ef681ee39" (UID: "af891066-3d45-4398-a2f7-5f1ef681ee39"). InnerVolumeSpecName "kube-api-access-9ccvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.343184 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af891066-3d45-4398-a2f7-5f1ef681ee39" (UID: "af891066-3d45-4398-a2f7-5f1ef681ee39"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.396920 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ccvv\" (UniqueName: \"kubernetes.io/projected/af891066-3d45-4398-a2f7-5f1ef681ee39-kube-api-access-9ccvv\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.396957 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.396966 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af891066-3d45-4398-a2f7-5f1ef681ee39-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.611151 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wjf" event={"ID":"af891066-3d45-4398-a2f7-5f1ef681ee39","Type":"ContainerDied","Data":"81ad8a59f13779c74ce219ed43a8a7c08e9b2d3718d51d0bc001fde779b0a758"} Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.611898 4752 scope.go:117] "RemoveContainer" containerID="6fb439de130305b25ba73fc669c7c7af4b39e37e18cb868bf526c1967c320095" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.611212 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6wjf" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.644816 4752 scope.go:117] "RemoveContainer" containerID="ad48925e094388dc89e4ce8c8226c46dd8b3625e82e9cc9f7fe4d0aa7db57a92" Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.650114 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.655083 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d6wjf"] Nov 24 12:01:31 crc kubenswrapper[4752]: I1124 12:01:31.662375 4752 scope.go:117] "RemoveContainer" containerID="8dfd50bd472334a1ddbe359d078c193a631ea7a0486f1ac0d06fd4cbaecf75f0" Nov 24 12:01:32 crc kubenswrapper[4752]: I1124 12:01:32.736495 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" path="/var/lib/kubelet/pods/af891066-3d45-4398-a2f7-5f1ef681ee39/volumes" Nov 24 12:01:45 crc kubenswrapper[4752]: I1124 12:01:45.468534 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:01:45 crc kubenswrapper[4752]: I1124 12:01:45.470481 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:02:15 crc kubenswrapper[4752]: I1124 12:02:15.469309 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:02:15 crc kubenswrapper[4752]: I1124 12:02:15.469937 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:02:45 crc kubenswrapper[4752]: I1124 12:02:45.468492 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:02:45 crc kubenswrapper[4752]: I1124 12:02:45.468990 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:02:45 crc kubenswrapper[4752]: I1124 12:02:45.469035 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:02:45 crc kubenswrapper[4752]: I1124 12:02:45.469766 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:02:45 crc kubenswrapper[4752]: I1124 12:02:45.469833 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45" gracePeriod=600 Nov 24 12:02:46 crc kubenswrapper[4752]: I1124 12:02:46.228856 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45" exitCode=0 Nov 24 12:02:46 crc kubenswrapper[4752]: I1124 12:02:46.228955 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45"} Nov 24 12:02:46 crc kubenswrapper[4752]: I1124 12:02:46.229384 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"} Nov 24 12:02:46 crc kubenswrapper[4752]: I1124 12:02:46.229414 4752 scope.go:117] "RemoveContainer" containerID="907b3a90e8d229f8f10fdbf7bca82d9376cbc4d6ef9e7363d3b920398853c0b5" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.833897 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.834872 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="extract-content" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.834889 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="extract-content" Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.834901 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.834908 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.834925 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.834933 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.834953 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="extract-content" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.834960 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="extract-content" Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.834973 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="extract-utilities" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.834982 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="extract-utilities" Nov 24 12:04:07 crc kubenswrapper[4752]: E1124 12:04:07.835006 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="extract-utilities" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.835015 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="extract-utilities" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.835234 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a027cdf-3bff-4215-a9d9-04447529d989" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.835254 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="af891066-3d45-4398-a2f7-5f1ef681ee39" containerName="registry-server" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.836490 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.844103 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.848032 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.848090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcw57\" (UniqueName: \"kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.848128 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.948702 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.948806 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcw57\" (UniqueName: \"kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.948846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.949374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.949492 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:07 crc kubenswrapper[4752]: I1124 12:04:07.967165 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mcw57\" (UniqueName: \"kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57\") pod \"certified-operators-r57qv\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:08 crc kubenswrapper[4752]: I1124 12:04:08.161119 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:08 crc kubenswrapper[4752]: I1124 12:04:08.608310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:08 crc kubenswrapper[4752]: I1124 12:04:08.846621 4752 generic.go:334] "Generic (PLEG): container finished" podID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerID="62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35" exitCode=0 Nov 24 12:04:08 crc kubenswrapper[4752]: I1124 12:04:08.846670 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerDied","Data":"62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35"} Nov 24 12:04:08 crc kubenswrapper[4752]: I1124 12:04:08.846701 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerStarted","Data":"e01578e65e1361600fe8b85fa450e8f847a2eae5d179646736c1f81de2f6d222"} Nov 24 12:04:09 crc kubenswrapper[4752]: I1124 12:04:09.856204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerStarted","Data":"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9"} Nov 24 12:04:10 crc kubenswrapper[4752]: I1124 12:04:10.865169 4752 generic.go:334] "Generic (PLEG): container finished" podID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerID="00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9" exitCode=0 Nov 24 12:04:10 crc kubenswrapper[4752]: I1124 12:04:10.865238 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerDied","Data":"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9"} Nov 24 12:04:11 crc kubenswrapper[4752]: I1124 12:04:11.877764 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerStarted","Data":"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762"} Nov 24 12:04:11 crc kubenswrapper[4752]: I1124 12:04:11.903513 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r57qv" podStartSLOduration=2.4747455990000002 podStartE2EDuration="4.903466416s" podCreationTimestamp="2025-11-24 12:04:07 +0000 UTC" firstStartedPulling="2025-11-24 12:04:08.84832066 +0000 UTC m=+3454.833140949" lastFinishedPulling="2025-11-24 12:04:11.277041477 +0000 UTC m=+3457.261861766" observedRunningTime="2025-11-24 12:04:11.901030556 +0000 UTC m=+3457.885850865" watchObservedRunningTime="2025-11-24 12:04:11.903466416 +0000 UTC m=+3457.888286725" Nov 24 12:04:18 crc kubenswrapper[4752]: I1124 12:04:18.161840 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:18 crc kubenswrapper[4752]: I1124 12:04:18.162370 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:18 crc kubenswrapper[4752]: I1124 12:04:18.210991 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:18 crc kubenswrapper[4752]: I1124 12:04:18.976835 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:19 crc kubenswrapper[4752]: I1124 12:04:19.028355 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:20 crc kubenswrapper[4752]: I1124 12:04:20.948400 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r57qv" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="registry-server" containerID="cri-o://8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762" gracePeriod=2 Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.380334 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.553576 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcw57\" (UniqueName: \"kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57\") pod \"2cdf6a2a-b023-468b-9b21-1165e76a9498\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.553737 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities\") pod \"2cdf6a2a-b023-468b-9b21-1165e76a9498\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.553883 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content\") pod \"2cdf6a2a-b023-468b-9b21-1165e76a9498\" (UID: \"2cdf6a2a-b023-468b-9b21-1165e76a9498\") " Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.554811 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities" (OuterVolumeSpecName: "utilities") pod "2cdf6a2a-b023-468b-9b21-1165e76a9498" (UID: "2cdf6a2a-b023-468b-9b21-1165e76a9498"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.559797 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57" (OuterVolumeSpecName: "kube-api-access-mcw57") pod "2cdf6a2a-b023-468b-9b21-1165e76a9498" (UID: "2cdf6a2a-b023-468b-9b21-1165e76a9498"). InnerVolumeSpecName "kube-api-access-mcw57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.604016 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cdf6a2a-b023-468b-9b21-1165e76a9498" (UID: "2cdf6a2a-b023-468b-9b21-1165e76a9498"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.655964 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcw57\" (UniqueName: \"kubernetes.io/projected/2cdf6a2a-b023-468b-9b21-1165e76a9498-kube-api-access-mcw57\") on node \"crc\" DevicePath \"\"" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.656007 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.656016 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cdf6a2a-b023-468b-9b21-1165e76a9498-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.958474 4752 generic.go:334] "Generic (PLEG): container finished" podID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerID="8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762" exitCode=0 Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.958519 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerDied","Data":"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762"} Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.958531 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r57qv" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.958549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r57qv" event={"ID":"2cdf6a2a-b023-468b-9b21-1165e76a9498","Type":"ContainerDied","Data":"e01578e65e1361600fe8b85fa450e8f847a2eae5d179646736c1f81de2f6d222"} Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.958570 4752 scope.go:117] "RemoveContainer" containerID="8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.988067 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.990588 4752 scope.go:117] "RemoveContainer" containerID="00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9" Nov 24 12:04:21 crc kubenswrapper[4752]: I1124 12:04:21.993760 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r57qv"] Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.011894 4752 scope.go:117] "RemoveContainer" containerID="62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.028670 4752 scope.go:117] "RemoveContainer" containerID="8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762" Nov 24 12:04:22 crc kubenswrapper[4752]: E1124 12:04:22.029038 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762\": container with ID starting with 8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762 not found: ID does not exist" containerID="8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.029074 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762"} err="failed to get container status \"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762\": rpc error: code = NotFound desc = could not find container \"8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762\": container with ID starting with 8172a7a512cb87498d19dc424609342812c17520029e55a3b65c4ff6d38a8762 not found: ID does not exist" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.029119 4752 scope.go:117] "RemoveContainer" containerID="00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9" Nov 24 12:04:22 crc kubenswrapper[4752]: E1124 12:04:22.029365 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9\": container with ID starting with 00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9 not found: ID does not exist" containerID="00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.029396 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9"} err="failed to get container status \"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9\": rpc error: code = NotFound desc = could not find 
container \"00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9\": container with ID starting with 00f0c217541bf394ecea8c2bb2374172524c1a8bd98f0e885115ffc85e095de9 not found: ID does not exist" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.029413 4752 scope.go:117] "RemoveContainer" containerID="62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35" Nov 24 12:04:22 crc kubenswrapper[4752]: E1124 12:04:22.029645 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35\": container with ID starting with 62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35 not found: ID does not exist" containerID="62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.029668 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35"} err="failed to get container status \"62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35\": rpc error: code = NotFound desc = could not find container \"62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35\": container with ID starting with 62b54306f123f3c0bed0c11dc9e9fca7b86c57c9a45449bd3dd0b95770ef6d35 not found: ID does not exist" Nov 24 12:04:22 crc kubenswrapper[4752]: I1124 12:04:22.748740 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" path="/var/lib/kubelet/pods/2cdf6a2a-b023-468b-9b21-1165e76a9498/volumes" Nov 24 12:04:45 crc kubenswrapper[4752]: I1124 12:04:45.468970 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:04:45 crc kubenswrapper[4752]: I1124 12:04:45.469611 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.566857 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:09 crc kubenswrapper[4752]: E1124 12:05:09.567659 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="extract-utilities" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.567671 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="extract-utilities" Nov 24 12:05:09 crc kubenswrapper[4752]: E1124 12:05:09.567682 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="extract-content" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.567688 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="extract-content" Nov 24 12:05:09 crc kubenswrapper[4752]: E1124 12:05:09.567698 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="registry-server" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.567704 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="registry-server" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.567932 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdf6a2a-b023-468b-9b21-1165e76a9498" containerName="registry-server" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.568882 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.582638 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.755110 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.755173 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.755201 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnktv\" (UniqueName: \"kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.856793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.856857 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.856883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnktv\" (UniqueName: \"kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.857376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities\") pod \"redhat-marketplace-b6k4p\" (UID: 
\"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.858003 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.876892 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnktv\" (UniqueName: \"kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv\") pod \"redhat-marketplace-b6k4p\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:09 crc kubenswrapper[4752]: I1124 12:05:09.940639 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:10 crc kubenswrapper[4752]: I1124 12:05:10.402202 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:11 crc kubenswrapper[4752]: I1124 12:05:11.322056 4752 generic.go:334] "Generic (PLEG): container finished" podID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerID="29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819" exitCode=0 Nov 24 12:05:11 crc kubenswrapper[4752]: I1124 12:05:11.322161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerDied","Data":"29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819"} Nov 24 12:05:11 crc kubenswrapper[4752]: I1124 12:05:11.322394 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerStarted","Data":"6057aa611665257bfbea00c1e40f4a26325ecf5e7cd707fcc897765272d20010"} Nov 24 12:05:12 crc kubenswrapper[4752]: I1124 12:05:12.331517 4752 generic.go:334] "Generic (PLEG): container finished" podID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerID="42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f" exitCode=0 Nov 24 12:05:12 crc kubenswrapper[4752]: I1124 12:05:12.331553 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerDied","Data":"42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f"} Nov 24 12:05:13 crc kubenswrapper[4752]: I1124 12:05:13.340913 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerStarted","Data":"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27"} Nov 24 12:05:13 crc kubenswrapper[4752]: I1124 12:05:13.361224 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b6k4p" podStartSLOduration=2.951899693 podStartE2EDuration="4.36120607s" podCreationTimestamp="2025-11-24 12:05:09 +0000 UTC" firstStartedPulling="2025-11-24 12:05:11.325567624 +0000 UTC m=+3517.310387913" lastFinishedPulling="2025-11-24 12:05:12.734874001 +0000 UTC m=+3518.719694290" observedRunningTime="2025-11-24 
12:05:13.359833661 +0000 UTC m=+3519.344653960" watchObservedRunningTime="2025-11-24 12:05:13.36120607 +0000 UTC m=+3519.346026359" Nov 24 12:05:15 crc kubenswrapper[4752]: I1124 12:05:15.468335 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:05:15 crc kubenswrapper[4752]: I1124 12:05:15.468393 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:05:19 crc kubenswrapper[4752]: I1124 12:05:19.941540 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:19 crc kubenswrapper[4752]: I1124 12:05:19.942125 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:19 crc kubenswrapper[4752]: I1124 12:05:19.993000 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:20 crc kubenswrapper[4752]: I1124 12:05:20.431502 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:20 crc kubenswrapper[4752]: I1124 12:05:20.484945 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.403967 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b6k4p" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="registry-server" containerID="cri-o://dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27" gracePeriod=2 Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.809174 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.941474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content\") pod \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.941667 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities\") pod \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.941711 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnktv\" (UniqueName: \"kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv\") pod \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\" (UID: \"4503519a-7bcd-4610-8e59-c4feb4d82ee3\") " Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.942533 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities" (OuterVolumeSpecName: "utilities") pod "4503519a-7bcd-4610-8e59-c4feb4d82ee3" (UID: "4503519a-7bcd-4610-8e59-c4feb4d82ee3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.954566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv" (OuterVolumeSpecName: "kube-api-access-nnktv") pod "4503519a-7bcd-4610-8e59-c4feb4d82ee3" (UID: "4503519a-7bcd-4610-8e59-c4feb4d82ee3"). InnerVolumeSpecName "kube-api-access-nnktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:05:22 crc kubenswrapper[4752]: I1124 12:05:22.962907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4503519a-7bcd-4610-8e59-c4feb4d82ee3" (UID: "4503519a-7bcd-4610-8e59-c4feb4d82ee3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.043224 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnktv\" (UniqueName: \"kubernetes.io/projected/4503519a-7bcd-4610-8e59-c4feb4d82ee3-kube-api-access-nnktv\") on node \"crc\" DevicePath \"\"" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.043264 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.043276 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4503519a-7bcd-4610-8e59-c4feb4d82ee3-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.416223 4752 generic.go:334] "Generic (PLEG): container finished" podID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerID="dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27" exitCode=0 Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.416300 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerDied","Data":"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27"} Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.416321 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6k4p" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.416357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6k4p" event={"ID":"4503519a-7bcd-4610-8e59-c4feb4d82ee3","Type":"ContainerDied","Data":"6057aa611665257bfbea00c1e40f4a26325ecf5e7cd707fcc897765272d20010"} Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.416395 4752 scope.go:117] "RemoveContainer" containerID="dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.455822 4752 scope.go:117] "RemoveContainer" containerID="42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.470415 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.476816 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6k4p"] Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.492078 4752 scope.go:117] "RemoveContainer" containerID="29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.510142 4752 scope.go:117] "RemoveContainer" containerID="dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27" Nov 24 12:05:23 crc kubenswrapper[4752]: E1124 12:05:23.510635 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27\": container with ID starting with dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27 not found: ID does not exist" containerID="dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.510667 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27"} err="failed to get container status \"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27\": rpc error: code = NotFound desc = could not find container \"dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27\": container with ID starting with dc6d7923d184c6bc23efa54bafb47688759cb0df754281ceb78522b02fb07c27 not found: ID does not exist" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.510694 4752 scope.go:117] "RemoveContainer" containerID="42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f" Nov 24 12:05:23 crc kubenswrapper[4752]: E1124 12:05:23.511044 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f\": container with ID starting with 42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f not found: ID does not exist" containerID="42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.511068 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f"} err="failed to get container status \"42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f\": rpc error: code = NotFound desc = could not find container \"42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f\": container with ID starting with 42e5bc82f845fe4635f3f6c6b593d3cea415f4cdfec10198af2c7dc78e5ab18f not found: ID does not exist" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.511084 4752 scope.go:117] "RemoveContainer" containerID="29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819" Nov 24 12:05:23 crc kubenswrapper[4752]: E1124 12:05:23.511294 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819\": container with ID starting with 29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819 not found: ID does not exist" containerID="29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819" Nov 24 12:05:23 crc kubenswrapper[4752]: I1124 12:05:23.511317 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819"} err="failed to get container status \"29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819\": rpc error: code = NotFound desc = could not find container \"29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819\": container with ID starting with 29d3e8de2b42f63846e16d23b7f901d2ad031fe0c2cc9c2b687ea2ca57985819 not found: ID does not exist" Nov 24 12:05:24 crc kubenswrapper[4752]: I1124 12:05:24.737989 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" path="/var/lib/kubelet/pods/4503519a-7bcd-4610-8e59-c4feb4d82ee3/volumes" Nov 24 12:05:45 crc kubenswrapper[4752]: I1124 12:05:45.469313 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:05:45 crc kubenswrapper[4752]: I1124 12:05:45.469991 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:05:45 crc kubenswrapper[4752]: I1124 12:05:45.470040 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:05:45 crc kubenswrapper[4752]: I1124 12:05:45.470628 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:05:45 crc kubenswrapper[4752]: I1124 12:05:45.470696 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" gracePeriod=600 Nov 24 12:05:45 crc kubenswrapper[4752]: E1124 12:05:45.644195 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:05:46 crc kubenswrapper[4752]: I1124 12:05:46.596668 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" exitCode=0 Nov 24 12:05:46 crc kubenswrapper[4752]: I1124 12:05:46.596729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"} Nov 24 12:05:46 crc kubenswrapper[4752]: I1124 12:05:46.597043 4752 scope.go:117] "RemoveContainer" containerID="edb1b974ef5fb118e959a83d42c4c5df135e8797e56158f3068ef15e6420bf45" Nov 24 12:05:46 crc kubenswrapper[4752]: I1124 12:05:46.597615 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:05:46 crc kubenswrapper[4752]: E1124 12:05:46.597936 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:05:58 crc kubenswrapper[4752]: I1124 12:05:58.728820 4752 scope.go:117] "RemoveContainer" 
containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:05:58 crc kubenswrapper[4752]: E1124 12:05:58.730046 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:06:09 crc kubenswrapper[4752]: I1124 12:06:09.728709 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:06:09 crc kubenswrapper[4752]: E1124 12:06:09.729492 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:06:20 crc kubenswrapper[4752]: I1124 12:06:20.728795 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:06:20 crc kubenswrapper[4752]: E1124 12:06:20.729512 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:06:35 crc kubenswrapper[4752]: I1124 12:06:35.728188 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:06:35 crc kubenswrapper[4752]: E1124 12:06:35.729012 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:06:47 crc kubenswrapper[4752]: I1124 12:06:47.727991 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:06:47 crc kubenswrapper[4752]: E1124 12:06:47.729089 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:06:59 crc kubenswrapper[4752]: I1124 12:06:59.728272 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:06:59 crc kubenswrapper[4752]: E1124 12:06:59.729078 4752 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:07:12 crc kubenswrapper[4752]: I1124 12:07:12.728524 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:07:12 crc kubenswrapper[4752]: E1124 12:07:12.729412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:07:23 crc kubenswrapper[4752]: I1124 12:07:23.728579 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:07:23 crc kubenswrapper[4752]: E1124 12:07:23.729398 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:07:34 crc kubenswrapper[4752]: I1124 12:07:34.736180 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:07:34 crc kubenswrapper[4752]: E1124 12:07:34.737288 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:07:47 crc kubenswrapper[4752]: I1124 12:07:47.728008 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:07:47 crc kubenswrapper[4752]: E1124 12:07:47.728918 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:07:58 crc kubenswrapper[4752]: I1124 12:07:58.728718 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5" Nov 24 12:07:58 crc kubenswrapper[4752]: E1124 12:07:58.731976 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:08:09 crc kubenswrapper[4752]: I1124 12:08:09.728131 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:08:09 crc kubenswrapper[4752]: E1124 12:08:09.728851 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:08:21 crc kubenswrapper[4752]: I1124 12:08:21.728824 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:08:21 crc kubenswrapper[4752]: E1124 12:08:21.731372 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:08:34 crc kubenswrapper[4752]: I1124 12:08:34.732960 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:08:34 crc kubenswrapper[4752]: E1124 12:08:34.734135 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:08:46 crc kubenswrapper[4752]: I1124 12:08:46.728730 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:08:46 crc kubenswrapper[4752]: E1124 12:08:46.729952 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:08:59 crc kubenswrapper[4752]: I1124 12:08:59.728473 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:08:59 crc kubenswrapper[4752]: E1124 12:08:59.729397 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:09:12 crc kubenswrapper[4752]: I1124 12:09:12.728470 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:09:12 crc kubenswrapper[4752]: E1124 12:09:12.734613 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:09:24 crc kubenswrapper[4752]: I1124 12:09:24.737095 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:09:24 crc kubenswrapper[4752]: E1124 12:09:24.738381 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:09:37 crc kubenswrapper[4752]: I1124 12:09:37.727987 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:09:37 crc kubenswrapper[4752]: E1124 12:09:37.728908 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:09:51 crc kubenswrapper[4752]: I1124 12:09:51.728399 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:09:51 crc kubenswrapper[4752]: E1124 12:09:51.730465 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:10:06 crc kubenswrapper[4752]: I1124 12:10:06.728127 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:10:06 crc kubenswrapper[4752]: E1124 12:10:06.729151 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:10:17 crc kubenswrapper[4752]: I1124 12:10:17.727885 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:10:17 crc kubenswrapper[4752]: E1124 12:10:17.729140 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:10:30 crc kubenswrapper[4752]: I1124 12:10:30.727920 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:10:30 crc kubenswrapper[4752]: E1124 12:10:30.728869 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:10:44 crc kubenswrapper[4752]: I1124 12:10:44.732163 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:10:44 crc kubenswrapper[4752]: E1124 12:10:44.733197 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:10:59 crc kubenswrapper[4752]: I1124 12:10:59.727822 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:10:59 crc kubenswrapper[4752]: I1124 12:10:59.904989 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee"}
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.224637 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:15 crc kubenswrapper[4752]: E1124 12:11:15.225790 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="registry-server"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.225807 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="registry-server"
Nov 24 12:11:15 crc kubenswrapper[4752]: E1124 12:11:15.225819 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="extract-utilities"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.225827 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="extract-utilities"
Nov 24 12:11:15 crc kubenswrapper[4752]: E1124 12:11:15.225857 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="extract-content"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.225865 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="extract-content"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.226007 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4503519a-7bcd-4610-8e59-c4feb4d82ee3" containerName="registry-server"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.226988 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.248384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.393852 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.394077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.394148 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.496107 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.496191 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.496225 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.496770 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.496815 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.515157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4\") pod \"redhat-operators-zmdbq\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") " pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.548605 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:15 crc kubenswrapper[4752]: I1124 12:11:15.990686 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:16 crc kubenswrapper[4752]: W1124 12:11:15.997432 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406baf43_1d64_4d56_baaa_0ddb4b263e08.slice/crio-eaf4cdd1b8bd9357a187ec8ec042302ab6cd860954ee948c83758df58841830e WatchSource:0}: Error finding container eaf4cdd1b8bd9357a187ec8ec042302ab6cd860954ee948c83758df58841830e: Status 404 returned error can't find the container with id eaf4cdd1b8bd9357a187ec8ec042302ab6cd860954ee948c83758df58841830e
Nov 24 12:11:16 crc kubenswrapper[4752]: I1124 12:11:16.029769 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerStarted","Data":"eaf4cdd1b8bd9357a187ec8ec042302ab6cd860954ee948c83758df58841830e"}
Nov 24 12:11:17 crc kubenswrapper[4752]: I1124 12:11:17.037762 4752 generic.go:334] "Generic (PLEG): container finished" podID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerID="ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2" exitCode=0
Nov 24 12:11:17 crc kubenswrapper[4752]: I1124 12:11:17.037822 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerDied","Data":"ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2"}
Nov 24 12:11:17 crc kubenswrapper[4752]: I1124 12:11:17.039686 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 24 12:11:18 crc kubenswrapper[4752]: I1124 12:11:18.046922 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerStarted","Data":"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"}
Nov 24 12:11:19 crc kubenswrapper[4752]: I1124 12:11:19.081082 4752 generic.go:334] "Generic (PLEG): container finished" podID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerID="e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2" exitCode=0
Nov 24 12:11:19 crc kubenswrapper[4752]: I1124 12:11:19.081264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerDied","Data":"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"}
Nov 24 12:11:20 crc kubenswrapper[4752]: I1124 12:11:20.089603 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerStarted","Data":"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"}
Nov 24 12:11:20 crc kubenswrapper[4752]: I1124 12:11:20.110201 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmdbq" podStartSLOduration=2.607823524 podStartE2EDuration="5.110184138s" podCreationTimestamp="2025-11-24 12:11:15 +0000 UTC" firstStartedPulling="2025-11-24 12:11:17.039473958 +0000 UTC m=+3883.024294247" lastFinishedPulling="2025-11-24 12:11:19.541834572 +0000 UTC m=+3885.526654861" observedRunningTime="2025-11-24 12:11:20.107291435 +0000 UTC m=+3886.092111724" watchObservedRunningTime="2025-11-24 12:11:20.110184138 +0000 UTC m=+3886.095004417"
Nov 24 12:11:25 crc kubenswrapper[4752]: I1124 12:11:25.549293 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:25 crc kubenswrapper[4752]: I1124 12:11:25.549928 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:25 crc kubenswrapper[4752]: I1124 12:11:25.592943 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:26 crc kubenswrapper[4752]: I1124 12:11:26.173921 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:26 crc kubenswrapper[4752]: I1124 12:11:26.226557 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:28 crc kubenswrapper[4752]: I1124 12:11:28.148370 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmdbq" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="registry-server" containerID="cri-o://16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3" gracePeriod=2
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.805162 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.893942 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4\") pod \"406baf43-1d64-4d56-baaa-0ddb4b263e08\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") "
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.894063 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content\") pod \"406baf43-1d64-4d56-baaa-0ddb4b263e08\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") "
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.894115 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities\") pod \"406baf43-1d64-4d56-baaa-0ddb4b263e08\" (UID: \"406baf43-1d64-4d56-baaa-0ddb4b263e08\") "
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.895368 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities" (OuterVolumeSpecName: "utilities") pod "406baf43-1d64-4d56-baaa-0ddb4b263e08" (UID: "406baf43-1d64-4d56-baaa-0ddb4b263e08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.899459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4" (OuterVolumeSpecName: "kube-api-access-zdth4") pod "406baf43-1d64-4d56-baaa-0ddb4b263e08" (UID: "406baf43-1d64-4d56-baaa-0ddb4b263e08"). InnerVolumeSpecName "kube-api-access-zdth4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.984240 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "406baf43-1d64-4d56-baaa-0ddb4b263e08" (UID: "406baf43-1d64-4d56-baaa-0ddb4b263e08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.996130 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.996171 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/406baf43-1d64-4d56-baaa-0ddb4b263e08-kube-api-access-zdth4\") on node \"crc\" DevicePath \"\""
Nov 24 12:11:29 crc kubenswrapper[4752]: I1124 12:11:29.996184 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/406baf43-1d64-4d56-baaa-0ddb4b263e08-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.165515 4752 generic.go:334] "Generic (PLEG): container finished" podID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerID="16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3" exitCode=0
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.165575 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerDied","Data":"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"}
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.165611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmdbq" event={"ID":"406baf43-1d64-4d56-baaa-0ddb4b263e08","Type":"ContainerDied","Data":"eaf4cdd1b8bd9357a187ec8ec042302ab6cd860954ee948c83758df58841830e"}
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.165650 4752 scope.go:117] "RemoveContainer" containerID="16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.165881 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmdbq"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.193911 4752 scope.go:117] "RemoveContainer" containerID="e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.210656 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.216589 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmdbq"]
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.236329 4752 scope.go:117] "RemoveContainer" containerID="ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.252095 4752 scope.go:117] "RemoveContainer" containerID="16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"
Nov 24 12:11:30 crc kubenswrapper[4752]: E1124 12:11:30.252656 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3\": container with ID starting with 16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3 not found: ID does not exist" containerID="16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.252711 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3"} err="failed to get container status \"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3\": rpc error: code = NotFound desc = could not find container \"16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3\": container with ID starting with 16f364c150c1ca9908878fb3c70b1bde08418fd7fc73dbcfbe9f1fb014b7ffe3 not found: ID does not exist"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.252757 4752 scope.go:117] "RemoveContainer" containerID="e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"
Nov 24 12:11:30 crc kubenswrapper[4752]: E1124 12:11:30.253141 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2\": container with ID starting with e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2 not found: ID does not exist" containerID="e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.253300 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2"} err="failed to get container status \"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2\": rpc error: code = NotFound desc = could not find container \"e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2\": container with ID starting with e185f4f26900b3995797db58a54a9c1577abd42ca94d5aa6fd6ce887138282c2 not found: ID does not exist"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.253431 4752 scope.go:117] "RemoveContainer" containerID="ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2"
Nov 24 12:11:30 crc kubenswrapper[4752]: E1124 12:11:30.254263 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2\": container with ID starting with ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2 not found: ID does not exist" containerID="ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.254299 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2"} err="failed to get container status \"ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2\": rpc error: code = NotFound desc = could not find container \"ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2\": container with ID starting with ee8ba0725796eb65a13711d66cfe80214afcb8d6067be1e3577a6cddbbad99d2 not found: ID does not exist"
Nov 24 12:11:30 crc kubenswrapper[4752]: I1124 12:11:30.738207 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" path="/var/lib/kubelet/pods/406baf43-1d64-4d56-baaa-0ddb4b263e08/volumes"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.972259 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:10 crc kubenswrapper[4752]: E1124 12:12:10.973216 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="registry-server"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.973230 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="registry-server"
Nov 24 12:12:10 crc kubenswrapper[4752]: E1124 12:12:10.973239 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="extract-utilities"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.973245 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="extract-utilities"
Nov 24 12:12:10 crc kubenswrapper[4752]: E1124 12:12:10.973258 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="extract-content"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.973265 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="extract-content"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.973393 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="406baf43-1d64-4d56-baaa-0ddb4b263e08" containerName="registry-server"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.974373 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:10 crc kubenswrapper[4752]: I1124 12:12:10.991191 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.090523 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltzk\" (UniqueName: \"kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.090568 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.090601 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.192624 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tltzk\" (UniqueName: \"kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.192662 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.192699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.193190 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.193294 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.213980 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltzk\" (UniqueName: \"kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk\") pod \"community-operators-4m2db\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") " pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.300109 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:11 crc kubenswrapper[4752]: I1124 12:12:11.780114 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:11 crc kubenswrapper[4752]: W1124 12:12:11.785499 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bab4439_df0a_49a4_9583_073a817432cf.slice/crio-5459bea3eb1ffdbca0d9d8a41b362866d6fd22cec6e064d9d2e1467833841ee2 WatchSource:0}: Error finding container 5459bea3eb1ffdbca0d9d8a41b362866d6fd22cec6e064d9d2e1467833841ee2: Status 404 returned error can't find the container with id 5459bea3eb1ffdbca0d9d8a41b362866d6fd22cec6e064d9d2e1467833841ee2
Nov 24 12:12:12 crc kubenswrapper[4752]: I1124 12:12:12.523910 4752 generic.go:334] "Generic (PLEG): container finished" podID="2bab4439-df0a-49a4-9583-073a817432cf" containerID="fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02" exitCode=0
Nov 24 12:12:12 crc kubenswrapper[4752]: I1124 12:12:12.523969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerDied","Data":"fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02"}
Nov 24 12:12:12 crc kubenswrapper[4752]: I1124 12:12:12.525965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerStarted","Data":"5459bea3eb1ffdbca0d9d8a41b362866d6fd22cec6e064d9d2e1467833841ee2"}
Nov 24 12:12:14 crc kubenswrapper[4752]: I1124 12:12:14.542499 4752 generic.go:334] "Generic (PLEG): container finished" podID="2bab4439-df0a-49a4-9583-073a817432cf" containerID="6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729" exitCode=0
Nov 24 12:12:14 crc kubenswrapper[4752]: I1124 12:12:14.542560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerDied","Data":"6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729"}
Nov 24 12:12:15 crc kubenswrapper[4752]: I1124 12:12:15.552726 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerStarted","Data":"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"}
Nov 24 12:12:15 crc kubenswrapper[4752]: I1124 12:12:15.571099 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4m2db" podStartSLOduration=2.914139302 podStartE2EDuration="5.571066884s" podCreationTimestamp="2025-11-24 12:12:10 +0000 UTC" firstStartedPulling="2025-11-24 12:12:12.525224518 +0000 UTC m=+3938.510044807" lastFinishedPulling="2025-11-24 12:12:15.1821521 +0000 UTC m=+3941.166972389" observedRunningTime="2025-11-24 12:12:15.567034338 +0000 UTC m=+3941.551854627" watchObservedRunningTime="2025-11-24 12:12:15.571066884 +0000 UTC m=+3941.555887173"
Nov 24 12:12:21 crc kubenswrapper[4752]: I1124 12:12:21.300260 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:21 crc kubenswrapper[4752]: I1124 12:12:21.301015 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:21 crc kubenswrapper[4752]: I1124 12:12:21.348330 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:21 crc kubenswrapper[4752]: I1124 12:12:21.634446 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:21 crc kubenswrapper[4752]: I1124 12:12:21.690909 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:23 crc kubenswrapper[4752]: I1124 12:12:23.612923 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4m2db" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="registry-server" containerID="cri-o://d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36" gracePeriod=2
Nov 24 12:12:23 crc kubenswrapper[4752]: I1124 12:12:23.999027 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.121032 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tltzk\" (UniqueName: \"kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk\") pod \"2bab4439-df0a-49a4-9583-073a817432cf\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") "
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.121137 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content\") pod \"2bab4439-df0a-49a4-9583-073a817432cf\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") "
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.121202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities\") pod \"2bab4439-df0a-49a4-9583-073a817432cf\" (UID: \"2bab4439-df0a-49a4-9583-073a817432cf\") "
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.122460 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities" (OuterVolumeSpecName: "utilities") pod "2bab4439-df0a-49a4-9583-073a817432cf" (UID: "2bab4439-df0a-49a4-9583-073a817432cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.127013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk" (OuterVolumeSpecName: "kube-api-access-tltzk") pod "2bab4439-df0a-49a4-9583-073a817432cf" (UID: "2bab4439-df0a-49a4-9583-073a817432cf"). InnerVolumeSpecName "kube-api-access-tltzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.178655 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bab4439-df0a-49a4-9583-073a817432cf" (UID: "2bab4439-df0a-49a4-9583-073a817432cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.222472 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tltzk\" (UniqueName: \"kubernetes.io/projected/2bab4439-df0a-49a4-9583-073a817432cf-kube-api-access-tltzk\") on node \"crc\" DevicePath \"\""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.222859 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.222870 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bab4439-df0a-49a4-9583-073a817432cf-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.621062 4752 generic.go:334] "Generic (PLEG): container finished" podID="2bab4439-df0a-49a4-9583-073a817432cf" containerID="d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36" exitCode=0
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.621121 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerDied","Data":"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"}
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.621160 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m2db"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.621181 4752 scope.go:117] "RemoveContainer" containerID="d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.621166 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m2db" event={"ID":"2bab4439-df0a-49a4-9583-073a817432cf","Type":"ContainerDied","Data":"5459bea3eb1ffdbca0d9d8a41b362866d6fd22cec6e064d9d2e1467833841ee2"}
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.641397 4752 scope.go:117] "RemoveContainer" containerID="6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.662157 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.668370 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4m2db"]
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.680023 4752 scope.go:117] "RemoveContainer" containerID="fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.701053 4752 scope.go:117] "RemoveContainer" containerID="d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"
Nov 24 12:12:24 crc kubenswrapper[4752]: E1124 12:12:24.701602 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36\": container with ID starting with d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36 not found: ID does not exist" containerID="d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.701657 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36"} err="failed to get container status \"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36\": rpc error: code = NotFound desc = could not find container \"d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36\": container with ID starting with d30e57fa2c483cf8cebdd32a3d4da34e75093fdfb6bc5ea5fc04b077888e0e36 not found: ID does not exist"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.701688 4752 scope.go:117] "RemoveContainer" containerID="6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729"
Nov 24 12:12:24 crc kubenswrapper[4752]: E1124 12:12:24.702179 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729\": container with ID starting with 6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729 not found: ID does not exist" containerID="6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.702213 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729"} err="failed to get container status \"6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729\": rpc error: code = NotFound desc = could not find container \"6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729\": container with ID starting with 6d7e43bd738b1150b1faf8c706d3f3479d67bb2c4863eebcd47ec3bab3915729 not found: ID does not exist"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.702240 4752 scope.go:117] "RemoveContainer" containerID="fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02"
Nov 24 12:12:24 crc kubenswrapper[4752]: E1124 12:12:24.702519 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02\": container with ID starting with fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02 not found: ID does not exist" containerID="fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.702551 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02"} err="failed to get container status \"fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02\": rpc error: code = NotFound desc = could not find container \"fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02\": container with ID starting with fa7066ac0eb5b236ea631e40d80bc1682c8096eccc3106ccfb59383a5d673b02 not found: ID does not exist"
Nov 24 12:12:24 crc kubenswrapper[4752]: I1124 12:12:24.736604 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bab4439-df0a-49a4-9583-073a817432cf" path="/var/lib/kubelet/pods/2bab4439-df0a-49a4-9583-073a817432cf/volumes"
Nov 24 12:13:15 crc kubenswrapper[4752]: I1124 12:13:15.469176 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 12:13:15 crc kubenswrapper[4752]: I1124 12:13:15.469874 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 12:13:45 crc kubenswrapper[4752]: I1124 12:13:45.468965 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 12:13:45 crc kubenswrapper[4752]: I1124 12:13:45.469649 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 12:14:15 crc kubenswrapper[4752]: I1124 12:14:15.469383 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 12:14:15 crc kubenswrapper[4752]: I1124 12:14:15.470017 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 12:14:15 crc kubenswrapper[4752]: I1124 12:14:15.470065 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 12:14:15 crc kubenswrapper[4752]: I1124 12:14:15.470670 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 12:14:15 crc kubenswrapper[4752]: I1124 12:14:15.470790 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee" gracePeriod=600
Nov 24 12:14:16 crc kubenswrapper[4752]: I1124 12:14:16.507300 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee" exitCode=0
Nov 24 12:14:16 crc kubenswrapper[4752]: I1124 12:14:16.507948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee"}
Nov 24 12:14:16 crc kubenswrapper[4752]: I1124 12:14:16.508028 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331"}
Nov 24 12:14:16 crc kubenswrapper[4752]: I1124 12:14:16.508100 4752 scope.go:117] "RemoveContainer" containerID="3450f74eabbb61df240826a6bd281029c1a9b71d1788a44a2246b8fdfc23bac5"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.399106 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:26 crc kubenswrapper[4752]: E1124 12:14:26.400094 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="extract-utilities"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.400113 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="extract-utilities"
Nov 24 12:14:26 crc kubenswrapper[4752]: E1124 12:14:26.400127 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="extract-content"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.400135 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="extract-content"
Nov 24 12:14:26 crc kubenswrapper[4752]: E1124 12:14:26.400154 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="registry-server"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.400163 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="registry-server"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.400337 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bab4439-df0a-49a4-9583-073a817432cf" containerName="registry-server"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.401542 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.413450 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.505610 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh7nd\" (UniqueName: \"kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.505682 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.505882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.607693 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh7nd\" (UniqueName: \"kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.607786 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.607836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.608332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.608332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.628703 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh7nd\" (UniqueName: \"kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd\") pod \"certified-operators-mvmln\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") " pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:26 crc kubenswrapper[4752]: I1124 12:14:26.766491 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:27 crc kubenswrapper[4752]: I1124 12:14:27.203347 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:27 crc kubenswrapper[4752]: I1124 12:14:27.583936 4752 generic.go:334] "Generic (PLEG): container finished" podID="3edbd3c0-308c-4031-9460-202f832a8e84" containerID="2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f" exitCode=0
Nov 24 12:14:27 crc kubenswrapper[4752]: I1124 12:14:27.584193 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerDied","Data":"2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f"}
Nov 24 12:14:27 crc kubenswrapper[4752]: I1124 12:14:27.584646 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerStarted","Data":"6dab72f2387525d273dbde102019f6a753f4383244c817b386bad29d44444cf1"}
Nov 24 12:14:28 crc kubenswrapper[4752]: I1124 12:14:28.597040 4752 generic.go:334] "Generic (PLEG): container finished" podID="3edbd3c0-308c-4031-9460-202f832a8e84" containerID="7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604" exitCode=0
Nov 24 12:14:28 crc kubenswrapper[4752]: I1124 12:14:28.597117 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerDied","Data":"7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604"}
Nov 24 12:14:29 crc kubenswrapper[4752]: I1124 12:14:29.606548 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerStarted","Data":"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"}
Nov 24 12:14:29 crc kubenswrapper[4752]: I1124 12:14:29.626115 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvmln" podStartSLOduration=2.238293581 podStartE2EDuration="3.626088082s" podCreationTimestamp="2025-11-24 12:14:26 +0000 UTC" firstStartedPulling="2025-11-24 12:14:27.58998224 +0000 UTC m=+4073.574802539" lastFinishedPulling="2025-11-24 12:14:28.977776751 +0000 UTC m=+4074.962597040" observedRunningTime="2025-11-24 12:14:29.624462405 +0000 UTC m=+4075.609282704" watchObservedRunningTime="2025-11-24 12:14:29.626088082 +0000 UTC m=+4075.610908391"
Nov 24 12:14:36 crc kubenswrapper[4752]: I1124 12:14:36.766699 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:36 crc kubenswrapper[4752]: I1124 12:14:36.767319 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:36 crc kubenswrapper[4752]: I1124 12:14:36.808537 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:37 crc kubenswrapper[4752]: I1124 12:14:37.734285 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:37 crc kubenswrapper[4752]: I1124 12:14:37.784140 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:39 crc kubenswrapper[4752]: I1124 12:14:39.701038 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mvmln" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="registry-server" containerID="cri-o://e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5" gracePeriod=2
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.098410 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.209350 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh7nd\" (UniqueName: \"kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd\") pod \"3edbd3c0-308c-4031-9460-202f832a8e84\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") "
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.209493 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities\") pod \"3edbd3c0-308c-4031-9460-202f832a8e84\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") "
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.209527 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content\") pod \"3edbd3c0-308c-4031-9460-202f832a8e84\" (UID: \"3edbd3c0-308c-4031-9460-202f832a8e84\") "
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.210644 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities" (OuterVolumeSpecName: "utilities") pod "3edbd3c0-308c-4031-9460-202f832a8e84" (UID: "3edbd3c0-308c-4031-9460-202f832a8e84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.218033 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd" (OuterVolumeSpecName: "kube-api-access-vh7nd") pod "3edbd3c0-308c-4031-9460-202f832a8e84" (UID: "3edbd3c0-308c-4031-9460-202f832a8e84"). InnerVolumeSpecName "kube-api-access-vh7nd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.256359 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3edbd3c0-308c-4031-9460-202f832a8e84" (UID: "3edbd3c0-308c-4031-9460-202f832a8e84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.311576 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh7nd\" (UniqueName: \"kubernetes.io/projected/3edbd3c0-308c-4031-9460-202f832a8e84-kube-api-access-vh7nd\") on node \"crc\" DevicePath \"\""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.311621 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.311634 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3edbd3c0-308c-4031-9460-202f832a8e84-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.710181 4752 generic.go:334] "Generic (PLEG): container finished" podID="3edbd3c0-308c-4031-9460-202f832a8e84" containerID="e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5" exitCode=0
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.710263 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvmln"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.710269 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerDied","Data":"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"}
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.710798 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvmln" event={"ID":"3edbd3c0-308c-4031-9460-202f832a8e84","Type":"ContainerDied","Data":"6dab72f2387525d273dbde102019f6a753f4383244c817b386bad29d44444cf1"}
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.710819 4752 scope.go:117] "RemoveContainer" containerID="e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.752913 4752 scope.go:117] "RemoveContainer" containerID="7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.759438 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.770519 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mvmln"]
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.775348 4752 scope.go:117] "RemoveContainer" containerID="2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.801953 4752 scope.go:117] "RemoveContainer" containerID="e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"
Nov 24 12:14:40 crc kubenswrapper[4752]: E1124 12:14:40.802591 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5\": container with ID starting with e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5 not found: ID does not exist" containerID="e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.802633 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5"} err="failed to get container status \"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5\": rpc error: code = NotFound desc = could not find container \"e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5\": container with ID starting with e427c17ea33410e8268914ad89aa4be9d562966939fc50022daa8668b40ff4e5 not found: ID does not exist"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.802661 4752 scope.go:117] "RemoveContainer" containerID="7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604"
Nov 24 12:14:40 crc kubenswrapper[4752]: E1124 12:14:40.803230 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604\": container with ID starting with 7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604 not found: ID does not exist" containerID="7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604"
Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.803326 4752
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604"} err="failed to get container status \"7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604\": rpc error: code = NotFound desc = could not find container \"7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604\": container with ID starting with 7b00ed6a30d0030ba5d7ec193904f56e7de27bf47e284fbc205f7fa30e5b0604 not found: ID does not exist" Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.803382 4752 scope.go:117] "RemoveContainer" containerID="2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f" Nov 24 12:14:40 crc kubenswrapper[4752]: E1124 12:14:40.804232 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f\": container with ID starting with 2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f not found: ID does not exist" containerID="2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f" Nov 24 12:14:40 crc kubenswrapper[4752]: I1124 12:14:40.804266 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f"} err="failed to get container status \"2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f\": rpc error: code = NotFound desc = could not find container \"2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f\": container with ID starting with 2abc4f622bf329d4d6e3f4c2cbd604b189d1cd793f1c8a66df1167bacc027e3f not found: ID does not exist" Nov 24 12:14:42 crc kubenswrapper[4752]: I1124 12:14:42.736385 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" path="/var/lib/kubelet/pods/3edbd3c0-308c-4031-9460-202f832a8e84/volumes" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.151605 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2"] Nov 24 12:15:00 crc kubenswrapper[4752]: E1124 12:15:00.152421 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="extract-content" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.152432 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="extract-content" Nov 24 12:15:00 crc kubenswrapper[4752]: E1124 12:15:00.152456 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="extract-utilities" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.152462 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="extract-utilities" Nov 24 12:15:00 crc kubenswrapper[4752]: E1124 12:15:00.152478 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="registry-server" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.152485 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="registry-server" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.152625 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3edbd3c0-308c-4031-9460-202f832a8e84" containerName="registry-server" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.153150 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.159454 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.159840 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.160093 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2"] Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.212898 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2t5j\" (UniqueName: \"kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.213253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.213378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.314179 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.314235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2t5j\" (UniqueName: \"kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.314305 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.315104 
4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.331661 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2t5j\" (UniqueName: \"kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.333853 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume\") pod \"collect-profiles-29399775-gqkn2\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.484661 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:00 crc kubenswrapper[4752]: I1124 12:15:00.910577 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2"] Nov 24 12:15:01 crc kubenswrapper[4752]: I1124 12:15:01.874723 4752 generic.go:334] "Generic (PLEG): container finished" podID="478b588e-3987-4b51-8934-cf455f0a6408" containerID="92e120f4ed3198101cd9f950393fe58fd6ebfec3363950c9eb88558d96b13480" exitCode=0 Nov 24 12:15:01 crc kubenswrapper[4752]: I1124 12:15:01.874802 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" event={"ID":"478b588e-3987-4b51-8934-cf455f0a6408","Type":"ContainerDied","Data":"92e120f4ed3198101cd9f950393fe58fd6ebfec3363950c9eb88558d96b13480"} Nov 24 12:15:01 crc kubenswrapper[4752]: I1124 12:15:01.874844 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" event={"ID":"478b588e-3987-4b51-8934-cf455f0a6408","Type":"ContainerStarted","Data":"03022a8edf79a34c18d3291239faddb5460bf9a170edce2888114eabe138609b"} Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.177683 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.352775 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2t5j\" (UniqueName: \"kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j\") pod \"478b588e-3987-4b51-8934-cf455f0a6408\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.353162 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume\") pod \"478b588e-3987-4b51-8934-cf455f0a6408\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.353202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume\") pod \"478b588e-3987-4b51-8934-cf455f0a6408\" (UID: \"478b588e-3987-4b51-8934-cf455f0a6408\") " Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.353855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume" (OuterVolumeSpecName: "config-volume") pod "478b588e-3987-4b51-8934-cf455f0a6408" (UID: "478b588e-3987-4b51-8934-cf455f0a6408"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.359032 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "478b588e-3987-4b51-8934-cf455f0a6408" (UID: "478b588e-3987-4b51-8934-cf455f0a6408"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.359916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j" (OuterVolumeSpecName: "kube-api-access-v2t5j") pod "478b588e-3987-4b51-8934-cf455f0a6408" (UID: "478b588e-3987-4b51-8934-cf455f0a6408"). InnerVolumeSpecName "kube-api-access-v2t5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.454202 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2t5j\" (UniqueName: \"kubernetes.io/projected/478b588e-3987-4b51-8934-cf455f0a6408-kube-api-access-v2t5j\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.454239 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/478b588e-3987-4b51-8934-cf455f0a6408-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.454254 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/478b588e-3987-4b51-8934-cf455f0a6408-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.888667 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" event={"ID":"478b588e-3987-4b51-8934-cf455f0a6408","Type":"ContainerDied","Data":"03022a8edf79a34c18d3291239faddb5460bf9a170edce2888114eabe138609b"} Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.888710 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03022a8edf79a34c18d3291239faddb5460bf9a170edce2888114eabe138609b" Nov 24 12:15:03 crc kubenswrapper[4752]: I1124 12:15:03.888762 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2" Nov 24 12:15:04 crc kubenswrapper[4752]: I1124 12:15:04.240361 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"] Nov 24 12:15:04 crc kubenswrapper[4752]: I1124 12:15:04.247820 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399730-wlws9"] Nov 24 12:15:04 crc kubenswrapper[4752]: I1124 12:15:04.736221 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f68b128-215a-4b08-b1ce-ef179f020723" path="/var/lib/kubelet/pods/4f68b128-215a-4b08-b1ce-ef179f020723/volumes" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.344735 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:12 crc kubenswrapper[4752]: E1124 12:15:12.349362 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478b588e-3987-4b51-8934-cf455f0a6408" containerName="collect-profiles" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.349540 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="478b588e-3987-4b51-8934-cf455f0a6408" containerName="collect-profiles" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.350517 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="478b588e-3987-4b51-8934-cf455f0a6408" containerName="collect-profiles" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.353642 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.355454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.475809 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26bl\" (UniqueName: \"kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.475884 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.475915 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.577045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l26bl\" (UniqueName: \"kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.577152 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.577189 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.577782 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.578955 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.600364 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l26bl\" (UniqueName: \"kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl\") pod \"redhat-marketplace-5fvcs\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:12 crc kubenswrapper[4752]: I1124 12:15:12.674392 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:13 crc kubenswrapper[4752]: I1124 12:15:13.104317 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:13 crc kubenswrapper[4752]: I1124 12:15:13.961438 4752 generic.go:334] "Generic (PLEG): container finished" podID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerID="8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d" exitCode=0 Nov 24 12:15:13 crc kubenswrapper[4752]: I1124 12:15:13.961543 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerDied","Data":"8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d"} Nov 24 12:15:13 crc kubenswrapper[4752]: I1124 12:15:13.961814 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerStarted","Data":"4ae04031aa9c29fde72698d1feceb3057ec0ceaaee51bf35ce746828184b7a85"} Nov 24 12:15:14 crc kubenswrapper[4752]: I1124 12:15:14.973802 4752 generic.go:334] "Generic (PLEG): container finished" podID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerID="9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52" exitCode=0 Nov 24 12:15:14 crc kubenswrapper[4752]: I1124 12:15:14.973870 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerDied","Data":"9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52"} Nov 24 12:15:15 crc kubenswrapper[4752]: I1124 12:15:15.984009 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerStarted","Data":"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422"} Nov 24 12:15:22 crc kubenswrapper[4752]: I1124 12:15:22.674569 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:22 crc kubenswrapper[4752]: I1124 12:15:22.675263 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:22 crc kubenswrapper[4752]: I1124 12:15:22.714803 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:22 crc kubenswrapper[4752]: I1124 12:15:22.734371 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fvcs" podStartSLOduration=9.316500183 podStartE2EDuration="10.734351258s" podCreationTimestamp="2025-11-24 12:15:12 +0000 UTC" firstStartedPulling="2025-11-24 12:15:13.963787867 +0000 UTC m=+4119.948608166" lastFinishedPulling="2025-11-24 12:15:15.381638952 +0000 UTC m=+4121.366459241" observedRunningTime="2025-11-24 12:15:16.001018271 +0000 UTC 
m=+4121.985838580" watchObservedRunningTime="2025-11-24 12:15:22.734351258 +0000 UTC m=+4128.719171547" Nov 24 12:15:23 crc kubenswrapper[4752]: I1124 12:15:23.082917 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:23 crc kubenswrapper[4752]: I1124 12:15:23.122810 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.062267 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5fvcs" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="registry-server" containerID="cri-o://2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422" gracePeriod=2 Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.473520 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.658809 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content\") pod \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.658860 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities\") pod \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.658996 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l26bl\" (UniqueName: \"kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl\") pod \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\" (UID: \"6094f2f8-ecf8-4541-bd68-b05c959fbd58\") " Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.660278 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities" (OuterVolumeSpecName: "utilities") pod "6094f2f8-ecf8-4541-bd68-b05c959fbd58" (UID: "6094f2f8-ecf8-4541-bd68-b05c959fbd58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.665515 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl" (OuterVolumeSpecName: "kube-api-access-l26bl") pod "6094f2f8-ecf8-4541-bd68-b05c959fbd58" (UID: "6094f2f8-ecf8-4541-bd68-b05c959fbd58"). InnerVolumeSpecName "kube-api-access-l26bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.675934 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6094f2f8-ecf8-4541-bd68-b05c959fbd58" (UID: "6094f2f8-ecf8-4541-bd68-b05c959fbd58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.761128 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.761158 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6094f2f8-ecf8-4541-bd68-b05c959fbd58-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:25 crc kubenswrapper[4752]: I1124 12:15:25.761170 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l26bl\" (UniqueName: \"kubernetes.io/projected/6094f2f8-ecf8-4541-bd68-b05c959fbd58-kube-api-access-l26bl\") on node \"crc\" DevicePath \"\"" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.071526 4752 generic.go:334] "Generic (PLEG): container finished" podID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerID="2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422" exitCode=0 Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.071565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerDied","Data":"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422"} Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.071594 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fvcs" event={"ID":"6094f2f8-ecf8-4541-bd68-b05c959fbd58","Type":"ContainerDied","Data":"4ae04031aa9c29fde72698d1feceb3057ec0ceaaee51bf35ce746828184b7a85"} Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.071612 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fvcs" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.071641 4752 scope.go:117] "RemoveContainer" containerID="2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.089954 4752 scope.go:117] "RemoveContainer" containerID="9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.101319 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.108727 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fvcs"] Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.129092 4752 scope.go:117] "RemoveContainer" containerID="8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.148323 4752 scope.go:117] "RemoveContainer" containerID="2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422" Nov 24 12:15:26 crc kubenswrapper[4752]: E1124 12:15:26.148860 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422\": container with ID starting with 2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422 not found: ID does not exist" containerID="2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.148901 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422"} err="failed to get container status \"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422\": rpc error: code = NotFound desc = could not find container \"2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422\": container with ID starting with 2a3707481ffcdb8deee1414c0d91e3aba9cd3542a265b0ef25dbd8e91fa42422 not found: ID does not exist" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.148922 4752 scope.go:117] "RemoveContainer" containerID="9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52" Nov 24 12:15:26 crc kubenswrapper[4752]: E1124 12:15:26.149257 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52\": container with ID starting with 9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52 not found: ID does not exist" containerID="9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.149282 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52"} err="failed to get container status \"9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52\": rpc error: code = NotFound desc = could not find container \"9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52\": container with ID starting with 9b8d0f33e580d967131206ca634c0daa1f766cd9a4afd8a4fad449ee756d8e52 not found: ID does not exist" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.149300 4752 scope.go:117] "RemoveContainer" 
containerID="8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d" Nov 24 12:15:26 crc kubenswrapper[4752]: E1124 12:15:26.149664 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d\": container with ID starting with 8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d not found: ID does not exist" containerID="8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.149719 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d"} err="failed to get container status \"8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d\": rpc error: code = NotFound desc = could not find container \"8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d\": container with ID starting with 8d3b1d7b931248460b309fc24e8e7aab4fcbac691f826529521e07563824de8d not found: ID does not exist" Nov 24 12:15:26 crc kubenswrapper[4752]: I1124 12:15:26.737036 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" path="/var/lib/kubelet/pods/6094f2f8-ecf8-4541-bd68-b05c959fbd58/volumes" Nov 24 12:15:41 crc kubenswrapper[4752]: I1124 12:15:41.527493 4752 scope.go:117] "RemoveContainer" containerID="9e6466701c5e2974943f058f4727ef5cde02f243f235bbd683f20ee6b05000b7" Nov 24 12:16:15 crc kubenswrapper[4752]: I1124 12:16:15.468780 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:16:15 crc kubenswrapper[4752]: I1124 12:16:15.469249 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:16:45 crc kubenswrapper[4752]: I1124 12:16:45.469034 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:16:45 crc kubenswrapper[4752]: I1124 12:16:45.470112 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.469188 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.469854 4752 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.469920 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.470495 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.470582 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" gracePeriod=600 Nov 24 12:17:15 crc kubenswrapper[4752]: E1124 12:17:15.601278 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.851827 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" exitCode=0 Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.851894 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331"} Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.851943 4752 scope.go:117] "RemoveContainer" containerID="6da3a157cf2d3aac64f3071b9f67cc0eac1a6b813d85528edb9f0789e946feee" Nov 24 12:17:15 crc kubenswrapper[4752]: I1124 12:17:15.852678 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:17:15 crc kubenswrapper[4752]: E1124 12:17:15.853091 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:17:28 crc kubenswrapper[4752]: I1124 12:17:28.727592 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:17:28 crc kubenswrapper[4752]: E1124 12:17:28.728357 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:17:42 crc kubenswrapper[4752]: I1124 12:17:42.728684 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:17:42 crc kubenswrapper[4752]: E1124 12:17:42.729903 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:17:54 crc kubenswrapper[4752]: I1124 12:17:54.733207 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:17:54 crc kubenswrapper[4752]: E1124 12:17:54.734243 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:18:08 crc kubenswrapper[4752]: I1124 12:18:08.727865 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:18:08 crc kubenswrapper[4752]: E1124 12:18:08.728534 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:18:19 crc kubenswrapper[4752]: I1124 12:18:19.728734 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:18:19 crc kubenswrapper[4752]: E1124 12:18:19.729968 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:18:33 crc kubenswrapper[4752]: I1124 12:18:33.728113 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:18:33 crc kubenswrapper[4752]: E1124 12:18:33.729119 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:18:42 crc kubenswrapper[4752]: I1124 12:18:42.930605 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-x24q4"] Nov 24 12:18:42 crc kubenswrapper[4752]: I1124 12:18:42.937500 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-x24q4"] Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.043412 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-s94lq"] Nov 24 12:18:43 crc kubenswrapper[4752]: E1124 12:18:43.043820 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="extract-content" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.043844 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="extract-content" Nov 24 12:18:43 crc kubenswrapper[4752]: E1124 12:18:43.043871 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="registry-server" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.043880 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="registry-server" Nov 24 12:18:43 crc kubenswrapper[4752]: E1124 12:18:43.043892 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="extract-utilities" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.043901 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="extract-utilities" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.044074 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6094f2f8-ecf8-4541-bd68-b05c959fbd58" containerName="registry-server" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.044647 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.046782 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.046825 4752 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wzbzq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.048372 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.048390 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.062436 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s94lq"] Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.172507 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.172632 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.172839 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzsqh\" (UniqueName: \"kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.304918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzsqh\" (UniqueName: \"kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.305485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.305632 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.306151 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " 
pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.306982 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.330395 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzsqh\" (UniqueName: \"kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh\") pod \"crc-storage-crc-s94lq\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.365411 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.787537 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s94lq"] Nov 24 12:18:43 crc kubenswrapper[4752]: I1124 12:18:43.797850 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:18:44 crc kubenswrapper[4752]: I1124 12:18:44.508700 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s94lq" event={"ID":"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4","Type":"ContainerStarted","Data":"9984e35d2a8c63bf81e61d3edba860ff9e2a571f4d705b5aa4b3a57ebbd59eed"} Nov 24 12:18:44 crc kubenswrapper[4752]: I1124 12:18:44.509145 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s94lq" event={"ID":"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4","Type":"ContainerStarted","Data":"38dfd718d0f242c728fdc06d8b5e14666b9619abf0c7c656ec9f160d8999059d"} Nov 24 12:18:44 crc kubenswrapper[4752]: I1124 12:18:44.524130 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-s94lq" podStartSLOduration=1.026780045 podStartE2EDuration="1.524110957s" podCreationTimestamp="2025-11-24 12:18:43 +0000 UTC" firstStartedPulling="2025-11-24 12:18:43.797621219 +0000 UTC m=+4329.782441508" lastFinishedPulling="2025-11-24 12:18:44.294952131 +0000 UTC m=+4330.279772420" observedRunningTime="2025-11-24 12:18:44.523116358 +0000 UTC m=+4330.507936647" watchObservedRunningTime="2025-11-24 12:18:44.524110957 +0000 UTC m=+4330.508931246" Nov 24 12:18:44 crc kubenswrapper[4752]: I1124 12:18:44.738441 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f" path="/var/lib/kubelet/pods/4c64e610-fc8d-4fd5-a71f-f8fa7c6d7f3f/volumes" Nov 24 12:18:45 crc kubenswrapper[4752]: I1124 12:18:45.518649 4752 generic.go:334] "Generic (PLEG): container finished" podID="cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" containerID="9984e35d2a8c63bf81e61d3edba860ff9e2a571f4d705b5aa4b3a57ebbd59eed" exitCode=0 Nov 24 12:18:45 crc kubenswrapper[4752]: I1124 12:18:45.518791 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s94lq" event={"ID":"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4","Type":"ContainerDied","Data":"9984e35d2a8c63bf81e61d3edba860ff9e2a571f4d705b5aa4b3a57ebbd59eed"} Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.738354 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:18:46 crc 
kubenswrapper[4752]: E1124 12:18:46.739003 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.785162 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.959899 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage\") pod \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.960018 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzsqh\" (UniqueName: \"kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh\") pod \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.960116 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt\") pod \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\" (UID: \"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4\") " Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.960242 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" (UID: "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.960757 4752 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.966933 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh" (OuterVolumeSpecName: "kube-api-access-vzsqh") pod "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" (UID: "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4"). InnerVolumeSpecName "kube-api-access-vzsqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:18:46 crc kubenswrapper[4752]: I1124 12:18:46.991985 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" (UID: "cb44fbe3-4c06-4223-bf66-5c7798cf4dd4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:18:47 crc kubenswrapper[4752]: I1124 12:18:47.062519 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzsqh\" (UniqueName: \"kubernetes.io/projected/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-kube-api-access-vzsqh\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:47 crc kubenswrapper[4752]: I1124 12:18:47.062557 4752 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:47 crc kubenswrapper[4752]: I1124 12:18:47.542559 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s94lq" Nov 24 12:18:47 crc kubenswrapper[4752]: I1124 12:18:47.542962 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s94lq" event={"ID":"cb44fbe3-4c06-4223-bf66-5c7798cf4dd4","Type":"ContainerDied","Data":"38dfd718d0f242c728fdc06d8b5e14666b9619abf0c7c656ec9f160d8999059d"} Nov 24 12:18:47 crc kubenswrapper[4752]: I1124 12:18:47.543019 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38dfd718d0f242c728fdc06d8b5e14666b9619abf0c7c656ec9f160d8999059d" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.640399 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-s94lq"] Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.644970 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-s94lq"] Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.738846 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" path="/var/lib/kubelet/pods/cb44fbe3-4c06-4223-bf66-5c7798cf4dd4/volumes" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.773613 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jzrl9"] Nov 24 12:18:48 crc kubenswrapper[4752]: E1124 12:18:48.774071 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" containerName="storage" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.774090 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" containerName="storage" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.774307 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb44fbe3-4c06-4223-bf66-5c7798cf4dd4" containerName="storage" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.774877 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.784942 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.785152 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.785266 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.785621 4752 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wzbzq" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.798213 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.798292 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvdf\" (UniqueName: \"kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.798346 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.800433 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jzrl9"] Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.899221 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.899280 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbvdf\" (UniqueName: \"kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.899322 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.899580 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " 
pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.900059 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:48 crc kubenswrapper[4752]: I1124 12:18:48.918505 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvdf\" (UniqueName: \"kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf\") pod \"crc-storage-crc-jzrl9\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:49 crc kubenswrapper[4752]: I1124 12:18:49.114327 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:49 crc kubenswrapper[4752]: I1124 12:18:49.332167 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jzrl9"] Nov 24 12:18:49 crc kubenswrapper[4752]: W1124 12:18:49.342909 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd26d8c7c_b2dd_44b3_913e_06c75e376c88.slice/crio-45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2 WatchSource:0}: Error finding container 45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2: Status 404 returned error can't find the container with id 45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2 Nov 24 12:18:49 crc kubenswrapper[4752]: I1124 12:18:49.562612 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jzrl9" event={"ID":"d26d8c7c-b2dd-44b3-913e-06c75e376c88","Type":"ContainerStarted","Data":"45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2"} Nov 24 12:18:50 crc kubenswrapper[4752]: I1124 12:18:50.574670 4752 generic.go:334] "Generic (PLEG): container finished" podID="d26d8c7c-b2dd-44b3-913e-06c75e376c88" containerID="658c56f1eb738577f5bd49e5d76b23937c10620d2c10fb6ea5f287dbb4e53ca3" exitCode=0 Nov 24 12:18:50 crc kubenswrapper[4752]: I1124 12:18:50.574779 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jzrl9" event={"ID":"d26d8c7c-b2dd-44b3-913e-06c75e376c88","Type":"ContainerDied","Data":"658c56f1eb738577f5bd49e5d76b23937c10620d2c10fb6ea5f287dbb4e53ca3"} Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.829159 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.941889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt\") pod \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.941939 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbvdf\" (UniqueName: \"kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf\") pod \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.942071 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage\") pod \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\" (UID: \"d26d8c7c-b2dd-44b3-913e-06c75e376c88\") " Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.942861 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d26d8c7c-b2dd-44b3-913e-06c75e376c88" (UID: "d26d8c7c-b2dd-44b3-913e-06c75e376c88"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.947270 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf" (OuterVolumeSpecName: "kube-api-access-pbvdf") pod "d26d8c7c-b2dd-44b3-913e-06c75e376c88" (UID: "d26d8c7c-b2dd-44b3-913e-06c75e376c88"). InnerVolumeSpecName "kube-api-access-pbvdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:18:51 crc kubenswrapper[4752]: I1124 12:18:51.961653 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d26d8c7c-b2dd-44b3-913e-06c75e376c88" (UID: "d26d8c7c-b2dd-44b3-913e-06c75e376c88"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.043990 4752 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d26d8c7c-b2dd-44b3-913e-06c75e376c88-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.044031 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbvdf\" (UniqueName: \"kubernetes.io/projected/d26d8c7c-b2dd-44b3-913e-06c75e376c88-kube-api-access-pbvdf\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.044045 4752 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d26d8c7c-b2dd-44b3-913e-06c75e376c88-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.590943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jzrl9" event={"ID":"d26d8c7c-b2dd-44b3-913e-06c75e376c88","Type":"ContainerDied","Data":"45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2"} Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.590992 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45f7559723eda2f531138088539af17a708a37c9f626dab8ed700603c2bc78d2" Nov 24 12:18:52 crc kubenswrapper[4752]: I1124 12:18:52.591063 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jzrl9" Nov 24 12:18:52 crc kubenswrapper[4752]: E1124 12:18:52.761261 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd26d8c7c_b2dd_44b3_913e_06c75e376c88.slice\": RecentStats: unable to find data in memory cache]" Nov 24 12:18:57 crc kubenswrapper[4752]: I1124 12:18:57.729556 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:18:57 crc kubenswrapper[4752]: E1124 12:18:57.731025 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:19:09 crc kubenswrapper[4752]: I1124 12:19:09.728432 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:19:09 crc kubenswrapper[4752]: E1124 12:19:09.729460 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:19:22 crc kubenswrapper[4752]: I1124 12:19:22.727588 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:19:22 crc kubenswrapper[4752]: E1124 12:19:22.728387 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:19:37 crc kubenswrapper[4752]: I1124 12:19:37.728249 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:19:37 crc kubenswrapper[4752]: E1124 12:19:37.729060 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:19:41 crc kubenswrapper[4752]: I1124 12:19:41.652296 4752 scope.go:117] "RemoveContainer" containerID="49a92d8cacf21c15dacb7e2e661eec4d93f24f9175ad5a8324777a4edc76aac5" Nov 24 12:19:49 crc kubenswrapper[4752]: I1124 12:19:49.728164 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:19:49 crc kubenswrapper[4752]: E1124 12:19:49.729013 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:01 crc kubenswrapper[4752]: I1124 12:20:01.727259 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:01 crc kubenswrapper[4752]: E1124 12:20:01.728047 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:13 crc kubenswrapper[4752]: I1124 12:20:13.728212 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:13 crc kubenswrapper[4752]: E1124 12:20:13.729143 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:24 crc kubenswrapper[4752]: I1124 12:20:24.733153 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:24 crc kubenswrapper[4752]: E1124 12:20:24.733802 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:36 crc kubenswrapper[4752]: I1124 12:20:36.728006 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:36 crc kubenswrapper[4752]: E1124 12:20:36.728729 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:47 crc kubenswrapper[4752]: I1124 12:20:47.727780 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:47 crc kubenswrapper[4752]: E1124 12:20:47.728648 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:20:58 crc kubenswrapper[4752]: I1124 12:20:58.727577 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:20:58 crc kubenswrapper[4752]: E1124 12:20:58.728334 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:21:10 crc kubenswrapper[4752]: I1124 12:21:10.728143 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:21:10 crc kubenswrapper[4752]: E1124 12:21:10.729188 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.164990 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:15 crc kubenswrapper[4752]: E1124 12:21:15.165798 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26d8c7c-b2dd-44b3-913e-06c75e376c88" containerName="storage" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.165814 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d26d8c7c-b2dd-44b3-913e-06c75e376c88" containerName="storage" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.165974 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26d8c7c-b2dd-44b3-913e-06c75e376c88" containerName="storage" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.167083 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.198528 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.244084 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.244170 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.244207 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n956\" (UniqueName: \"kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.344831 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.345198 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.345226 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n956\" (UniqueName: \"kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.345285 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.345553 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.366522 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n956\" (UniqueName: \"kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956\") pod \"redhat-operators-gwp6g\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.493464 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:15 crc kubenswrapper[4752]: I1124 12:21:15.904952 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:15 crc kubenswrapper[4752]: W1124 12:21:15.914010 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddead7321_d402_4c56_acb7_ce18c49b6941.slice/crio-303edccc28cf0860e4caea372fd3167588e916c923dd1c29d8a0aa1fc1affc99 WatchSource:0}: Error finding container 303edccc28cf0860e4caea372fd3167588e916c923dd1c29d8a0aa1fc1affc99: Status 404 returned error can't find the container with id 303edccc28cf0860e4caea372fd3167588e916c923dd1c29d8a0aa1fc1affc99 Nov 24 12:21:16 crc kubenswrapper[4752]: I1124 12:21:16.693344 4752 generic.go:334] "Generic (PLEG): container finished" podID="dead7321-d402-4c56-acb7-ce18c49b6941" containerID="e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046" exitCode=0 Nov 24 12:21:16 crc kubenswrapper[4752]: I1124 12:21:16.693453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerDied","Data":"e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046"} Nov 24 12:21:16 crc kubenswrapper[4752]: I1124 12:21:16.693836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerStarted","Data":"303edccc28cf0860e4caea372fd3167588e916c923dd1c29d8a0aa1fc1affc99"} Nov 24 12:21:17 crc kubenswrapper[4752]: I1124 12:21:17.703492 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerStarted","Data":"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88"} Nov 24 12:21:18 crc kubenswrapper[4752]: I1124 12:21:18.715711 4752 generic.go:334] "Generic (PLEG): container finished" podID="dead7321-d402-4c56-acb7-ce18c49b6941" containerID="1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88" exitCode=0 Nov 24 12:21:18 crc kubenswrapper[4752]: I1124 12:21:18.715799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerDied","Data":"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88"} Nov 24 12:21:19 crc kubenswrapper[4752]: I1124 12:21:19.725491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" 
event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerStarted","Data":"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb"} Nov 24 12:21:19 crc kubenswrapper[4752]: I1124 12:21:19.744444 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gwp6g" podStartSLOduration=2.284131407 podStartE2EDuration="4.744424259s" podCreationTimestamp="2025-11-24 12:21:15 +0000 UTC" firstStartedPulling="2025-11-24 12:21:16.696361127 +0000 UTC m=+4482.681181416" lastFinishedPulling="2025-11-24 12:21:19.156653969 +0000 UTC m=+4485.141474268" observedRunningTime="2025-11-24 12:21:19.741079833 +0000 UTC m=+4485.725900122" watchObservedRunningTime="2025-11-24 12:21:19.744424259 +0000 UTC m=+4485.729244558" Nov 24 12:21:24 crc kubenswrapper[4752]: I1124 12:21:24.740216 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:21:24 crc kubenswrapper[4752]: E1124 12:21:24.743231 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:21:25 crc kubenswrapper[4752]: I1124 12:21:25.493630 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:25 crc kubenswrapper[4752]: I1124 12:21:25.493689 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:26 crc kubenswrapper[4752]: I1124 12:21:26.532629 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gwp6g" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="registry-server" probeResult="failure" output=< Nov 24 12:21:26 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:21:26 crc kubenswrapper[4752]: > Nov 24 12:21:35 crc kubenswrapper[4752]: I1124 12:21:35.531044 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:35 crc kubenswrapper[4752]: I1124 12:21:35.577030 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:35 crc kubenswrapper[4752]: I1124 12:21:35.762563 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:36 crc kubenswrapper[4752]: I1124 12:21:36.870961 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gwp6g" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="registry-server" containerID="cri-o://2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb" gracePeriod=2 Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.222465 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.347834 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content\") pod \"dead7321-d402-4c56-acb7-ce18c49b6941\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.347912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n956\" (UniqueName: \"kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956\") pod \"dead7321-d402-4c56-acb7-ce18c49b6941\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.347988 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities\") pod \"dead7321-d402-4c56-acb7-ce18c49b6941\" (UID: \"dead7321-d402-4c56-acb7-ce18c49b6941\") " Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.349129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities" (OuterVolumeSpecName: "utilities") pod "dead7321-d402-4c56-acb7-ce18c49b6941" (UID: "dead7321-d402-4c56-acb7-ce18c49b6941"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.353382 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956" (OuterVolumeSpecName: "kube-api-access-5n956") pod "dead7321-d402-4c56-acb7-ce18c49b6941" (UID: "dead7321-d402-4c56-acb7-ce18c49b6941"). InnerVolumeSpecName "kube-api-access-5n956". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.440496 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dead7321-d402-4c56-acb7-ce18c49b6941" (UID: "dead7321-d402-4c56-acb7-ce18c49b6941"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.450867 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.451017 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n956\" (UniqueName: \"kubernetes.io/projected/dead7321-d402-4c56-acb7-ce18c49b6941-kube-api-access-5n956\") on node \"crc\" DevicePath \"\"" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.451032 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dead7321-d402-4c56-acb7-ce18c49b6941-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.878346 4752 generic.go:334] "Generic (PLEG): container finished" podID="dead7321-d402-4c56-acb7-ce18c49b6941" containerID="2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb" exitCode=0 Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.878388 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerDied","Data":"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb"} Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.878406 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gwp6g" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.878427 4752 scope.go:117] "RemoveContainer" containerID="2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.878416 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gwp6g" event={"ID":"dead7321-d402-4c56-acb7-ce18c49b6941","Type":"ContainerDied","Data":"303edccc28cf0860e4caea372fd3167588e916c923dd1c29d8a0aa1fc1affc99"} Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.896729 4752 scope.go:117] "RemoveContainer" containerID="1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.910317 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.915436 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gwp6g"] Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.934566 4752 scope.go:117] "RemoveContainer" containerID="e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.947924 4752 scope.go:117] "RemoveContainer" containerID="2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb" Nov 24 12:21:37 crc kubenswrapper[4752]: E1124 12:21:37.948384 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb\": container with ID starting with 2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb not found: ID does not exist" containerID="2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.948430 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb"} err="failed to get container status \"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb\": rpc error: code = NotFound desc = could not find container \"2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb\": container with ID starting with 2d3bc7447e77ee757fda0e4747e66e8d935c9f60db0bea3d0e17d51af23862cb not found: ID does not exist" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.948457 4752 scope.go:117] "RemoveContainer" containerID="1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88" Nov 24 12:21:37 crc kubenswrapper[4752]: E1124 12:21:37.948777 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88\": container with ID starting with 1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88 not found: ID does not exist" containerID="1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.948806 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88"} err="failed to get container status \"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88\": rpc error: code = NotFound desc = could not find container \"1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88\": container with ID starting with 1504230ca13a65f3c36bcefa5d3ae8612975af616f58b20414334a0e5cee7d88 not found: ID does not exist" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.948829 4752 scope.go:117] "RemoveContainer" containerID="e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046" Nov 24 12:21:37 crc kubenswrapper[4752]: E1124 12:21:37.949069 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046\": container with ID starting with e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046 not found: ID does not exist" containerID="e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046" Nov 24 12:21:37 crc kubenswrapper[4752]: I1124 12:21:37.949099 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046"} err="failed to get container status \"e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046\": rpc error: code = NotFound desc = could not find container \"e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046\": container with ID starting with e72aef7281e17824197ed22b0b58e92802053ad9a5c90263d8c59c47c0e31046 not found: ID does not exist" Nov 24 12:21:38 crc kubenswrapper[4752]: I1124 12:21:38.728018 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:21:38 crc kubenswrapper[4752]: E1124 12:21:38.728488 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:21:38 crc kubenswrapper[4752]: I1124 12:21:38.737076 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" path="/var/lib/kubelet/pods/dead7321-d402-4c56-acb7-ce18c49b6941/volumes" Nov 24 12:21:49 crc kubenswrapper[4752]: I1124 12:21:49.728362 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:21:49 crc kubenswrapper[4752]: E1124 12:21:49.729514 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.553357 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"] Nov 24 12:21:58 crc kubenswrapper[4752]: E1124 12:21:58.554304 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="extract-utilities" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.554322 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="extract-utilities" Nov 24 12:21:58 crc kubenswrapper[4752]: E1124 12:21:58.554344 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="extract-content" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.554352 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="extract-content" Nov 24 12:21:58 crc kubenswrapper[4752]: E1124 12:21:58.554374 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="registry-server" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.554382 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="registry-server" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.554565 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dead7321-d402-4c56-acb7-ce18c49b6941" containerName="registry-server" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.555485 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.557588 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.557776 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.557835 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.558187 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dx2r5" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.558378 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.562311 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.562365 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lhh\" (UniqueName: \"kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.562497 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.570045 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"] Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.663404 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.663473 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92lhh\" (UniqueName: \"kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.663514 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.664374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.664918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.689646 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lhh\" (UniqueName: \"kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh\") pod \"dnsmasq-dns-5d7b5456f5-vdbdb\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") " pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.790495 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"] Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.791611 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.810343 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"] Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.875297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.967705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6st\" (UniqueName: \"kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.968072 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:58 crc kubenswrapper[4752]: I1124 12:21:58.968101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.070583 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6st\" (UniqueName: \"kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.070631 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: 
\"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.070659 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.071426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.071625 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.110546 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6st\" (UniqueName: \"kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st\") pod \"dnsmasq-dns-98ddfc8f-cfmff\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.409583 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.490663 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"] Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.643519 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.644988 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.650535 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-96klc" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.650649 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.655089 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.655340 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.655370 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.671828 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697550 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697619 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697658 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjk6t\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697715 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697738 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697774 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697797 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.697988 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.799901 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.799995 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800028 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjk6t\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800067 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800090 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800108 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.800859 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.801019 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.801020 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.801358 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.802325 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.802426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.805548 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.805566 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.807450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0" Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.807857 4752 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.807913 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cec840fc310b133ee765763ff04aa0c368df6d959d37e1bdbc23d48f8ebe2d97/globalmount\"" pod="openstack/rabbitmq-server-0"
Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.819530 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjk6t\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0"
Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.835723 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") " pod="openstack/rabbitmq-server-0"
Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.869955 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"]
Nov 24 12:21:59 crc kubenswrapper[4752]: W1124 12:21:59.872795 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c417dbd_7198_4e8f_9136_a64236e924e8.slice/crio-cdbe2a61ef69e89f3118849f2b3b4e155b1394b5b8dc3e4a2183999f70bf5fc4 WatchSource:0}: Error finding container cdbe2a61ef69e89f3118849f2b3b4e155b1394b5b8dc3e4a2183999f70bf5fc4: Status 404 returned error can't find the container with id cdbe2a61ef69e89f3118849f2b3b4e155b1394b5b8dc3e4a2183999f70bf5fc4
Nov 24 12:21:59 crc kubenswrapper[4752]: I1124 12:21:59.981333 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.030270 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.031461 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.033998 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.034177 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.034446 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.034652 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6xkh5"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.034950 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.051459 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.085422 4752 generic.go:334] "Generic (PLEG): container finished" podID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerID="1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d" exitCode=0
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.085489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" event={"ID":"a2d34cc2-feb3-4657-996c-e2c3faf94b7d","Type":"ContainerDied","Data":"1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d"}
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.085517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" event={"ID":"a2d34cc2-feb3-4657-996c-e2c3faf94b7d","Type":"ContainerStarted","Data":"ca0c8a1750df00d5333c96e4240f3bcf4704f8b8267667bab473b54599bb9162"}
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.090429 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" event={"ID":"2c417dbd-7198-4e8f-9136-a64236e924e8","Type":"ContainerStarted","Data":"cdbe2a61ef69e89f3118849f2b3b4e155b1394b5b8dc3e4a2183999f70bf5fc4"}
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111012 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111278 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc
kubenswrapper[4752]: I1124 12:22:00.111300 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8ltd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111375 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111396 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111423 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.111447 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.217610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218013 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218080 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218107 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8ltd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218129 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218173 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218192 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218220 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.218241 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.219782 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.220639 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.220735 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.221017 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.224668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.231777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.232275 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.240615 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8ltd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.240718 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.240791 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d6896910fa2a6fe840e1e468d56f0adfe5685e733944c9fc4c358b75510fd35/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.281002 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.300855 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 24 12:22:00 crc kubenswrapper[4752]: E1124 12:22:00.332027 4752 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Nov 24 12:22:00 crc kubenswrapper[4752]: 	rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/a2d34cc2-feb3-4657-996c-e2c3faf94b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 24 12:22:00 crc kubenswrapper[4752]: > podSandboxID="ca0c8a1750df00d5333c96e4240f3bcf4704f8b8267667bab473b54599bb9162"
Nov 24 12:22:00 crc kubenswrapper[4752]: E1124 12:22:00.332196 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 24 12:22:00 crc kubenswrapper[4752]: 	container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92lhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-vdbdb_openstack(a2d34cc2-feb3-4657-996c-e2c3faf94b7d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/a2d34cc2-feb3-4657-996c-e2c3faf94b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 24 12:22:00 crc kubenswrapper[4752]: > logger="UnhandledError"
Nov 24 12:22:00 crc kubenswrapper[4752]: E1124 12:22:00.333532 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/a2d34cc2-feb3-4657-996c-e2c3faf94b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d"
Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.351509 4752 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:00 crc kubenswrapper[4752]: I1124 12:22:00.776009 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:22:00 crc kubenswrapper[4752]: W1124 12:22:00.779067 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90991bb7_c6be_4088_9071_14bd5884b506.slice/crio-0548b5244f526c9a91003e029c1df6ed4fc773ec3e511d04df8a4c2fc6e6679b WatchSource:0}: Error finding container 0548b5244f526c9a91003e029c1df6ed4fc773ec3e511d04df8a4c2fc6e6679b: Status 404 returned error can't find the container with id 0548b5244f526c9a91003e029c1df6ed4fc773ec3e511d04df8a4c2fc6e6679b Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.100022 4752 generic.go:334] "Generic (PLEG): container finished" podID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerID="d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82" exitCode=0 Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.100177 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" event={"ID":"2c417dbd-7198-4e8f-9136-a64236e924e8","Type":"ContainerDied","Data":"d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82"} Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.101570 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerStarted","Data":"59d83d68c9d55f83982b8a9bcbbe111bbbae86f6570e5780c0b498b1d45ddc40"} Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.102816 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerStarted","Data":"0548b5244f526c9a91003e029c1df6ed4fc773ec3e511d04df8a4c2fc6e6679b"} Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.307932 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.309236 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.311001 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.312318 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pgtjq" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.313060 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.313771 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.320949 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.328119 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334565 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334619 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334683 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnz4k\" (UniqueName: \"kubernetes.io/projected/d606151b-c042-4691-89a7-f1d8f21d033d-kube-api-access-gnz4k\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334711 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334794 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334824 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334845 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-default\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.334872 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-kolla-config\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.436638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437069 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437094 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-default\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-kolla-config\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437175 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437196 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnz4k\" (UniqueName: \"kubernetes.io/projected/d606151b-c042-4691-89a7-f1d8f21d033d-kube-api-access-gnz4k\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.437323 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.438237 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-kolla-config\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.439721 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-default\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.440346 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.441144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d606151b-c042-4691-89a7-f1d8f21d033d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.441259 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d606151b-c042-4691-89a7-f1d8f21d033d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.441293 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.441318 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9be50cdd941f369c498ca0b722aecefd207a14e81b9a9350048d9ed1d40b160/globalmount\"" pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.442959 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d606151b-c042-4691-89a7-f1d8f21d033d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.546380 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnz4k\" (UniqueName: \"kubernetes.io/projected/d606151b-c042-4691-89a7-f1d8f21d033d-kube-api-access-gnz4k\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.566669 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e034601-271f-4c99-8ed7-e02a868ebc6f\") pod \"openstack-galera-0\" (UID: \"d606151b-c042-4691-89a7-f1d8f21d033d\") " pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.628363 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.629208 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.630316 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.631263 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.632411 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kl8q2" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.636382 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.640555 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-kolla-config\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.640648 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dmz\" (UniqueName: \"kubernetes.io/projected/3b79e654-374a-40f7-87b6-08a45f62c170-kube-api-access-87dmz\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.640713 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-config-data\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.741968 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-config-data\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.742050 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-kolla-config\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.742123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dmz\" (UniqueName: \"kubernetes.io/projected/3b79e654-374a-40f7-87b6-08a45f62c170-kube-api-access-87dmz\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.743213 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-kolla-config\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.743293 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b79e654-374a-40f7-87b6-08a45f62c170-config-data\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.802272 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-87dmz\" (UniqueName: \"kubernetes.io/projected/3b79e654-374a-40f7-87b6-08a45f62c170-kube-api-access-87dmz\") pod \"memcached-0\" (UID: \"3b79e654-374a-40f7-87b6-08a45f62c170\") " pod="openstack/memcached-0" Nov 24 12:22:01 crc kubenswrapper[4752]: I1124 12:22:01.963164 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.034632 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 12:22:02 crc kubenswrapper[4752]: W1124 12:22:02.054680 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd606151b_c042_4691_89a7_f1d8f21d033d.slice/crio-a7c29812a81b52a650a6b15bb77770aeea83452703f9c01b88a82ba5807dc0af WatchSource:0}: Error finding container a7c29812a81b52a650a6b15bb77770aeea83452703f9c01b88a82ba5807dc0af: Status 404 returned error can't find the container with id a7c29812a81b52a650a6b15bb77770aeea83452703f9c01b88a82ba5807dc0af Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.111488 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d606151b-c042-4691-89a7-f1d8f21d033d","Type":"ContainerStarted","Data":"a7c29812a81b52a650a6b15bb77770aeea83452703f9c01b88a82ba5807dc0af"} Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.114523 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" event={"ID":"2c417dbd-7198-4e8f-9136-a64236e924e8","Type":"ContainerStarted","Data":"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f"} Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.114977 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.116074 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerStarted","Data":"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669"} Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.146137 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" podStartSLOduration=4.146111322 podStartE2EDuration="4.146111322s" podCreationTimestamp="2025-11-24 12:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:02.143311252 +0000 UTC m=+4528.128131541" watchObservedRunningTime="2025-11-24 12:22:02.146111322 +0000 UTC m=+4528.130931611" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.154226 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" event={"ID":"a2d34cc2-feb3-4657-996c-e2c3faf94b7d","Type":"ContainerStarted","Data":"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"} Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.154544 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.155420 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerStarted","Data":"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9"} Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.223658 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" podStartSLOduration=4.223633877 podStartE2EDuration="4.223633877s" podCreationTimestamp="2025-11-24 12:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:02.217060028 +0000 UTC m=+4528.201880337" watchObservedRunningTime="2025-11-24 12:22:02.223633877 +0000 UTC m=+4528.208454156" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.479492 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 24 12:22:02 crc kubenswrapper[4752]: W1124 12:22:02.482553 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b79e654_374a_40f7_87b6_08a45f62c170.slice/crio-d536841ee2c4b617a21c0537922e656a49a9f1d3f509652a5e6baf3b0b1a3027 WatchSource:0}: Error finding container d536841ee2c4b617a21c0537922e656a49a9f1d3f509652a5e6baf3b0b1a3027: Status 404 returned error can't find the container with id d536841ee2c4b617a21c0537922e656a49a9f1d3f509652a5e6baf3b0b1a3027 Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.800886 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.802667 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.805731 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2n8rj" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.805831 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.806387 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.806836 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.823310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959265 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959304 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959345 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjxwj\" (UniqueName: \"kubernetes.io/projected/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kube-api-access-wjxwj\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959466 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:02 crc kubenswrapper[4752]: I1124 12:22:02.959496 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061022 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061168 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061265 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061311 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjxwj\" (UniqueName: \"kubernetes.io/projected/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kube-api-access-wjxwj\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061375 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061404 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.061446 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.062030 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.062613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.063269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.063699 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.067217 4752 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.067266 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cadc0a85e55fddec1d3da1161b7214c0ffb0510a6933e183d47e1553a420c5d2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.067574 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.068458 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.078242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjxwj\" (UniqueName: \"kubernetes.io/projected/1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33-kube-api-access-wjxwj\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.098170 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0cbac2a4-a3eb-4aba-8848-cc14ba3e9aac\") pod \"openstack-cell1-galera-0\" (UID: \"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33\") " pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.128495 4752 util.go:30] "No sandbox for pod can be found. 
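The MountDevice skip above is worth a gloss: for CSI volumes the kubelet only runs a separate device-staging step (MountDevice, backed by NodeStageVolume) when the driver advertises the STAGE_UNSTAGE_VOLUME node capability; kubevirt.io.hostpath-provisioner evidently does not, so MountDevice is a no-op and the flow goes straight to SetUp (NodePublishVolume). A minimal sketch of the driver side, with illustrative names rather than the hostpath provisioner's actual source:

```go
// Sketch of the driver-side reason for the "Skipping MountDevice" line:
// a CSI node plugin that does not advertise STAGE_UNSTAGE_VOLUME.
package sketch

import (
	"context"

	"github.com/container-storage-interface/spec/lib/go/csi"
)

type nodeServer struct{}

// NodeGetCapabilities returning no STAGE_UNSTAGE_VOLUME entry tells the
// kubelet there is no NodeStageVolume step, so attacher.MountDevice
// becomes a no-op and publishing proceeds directly to NodePublishVolume.
func (nodeServer) NodeGetCapabilities(ctx context.Context, req *csi.NodeGetCapabilitiesRequest) (*csi.NodeGetCapabilitiesResponse, error) {
	// A driver that needs a staging step would instead return a
	// NodeServiceCapability wrapping
	// csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME here.
	return &csi.NodeGetCapabilitiesResponse{}, nil
}
```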
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.165124 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3b79e654-374a-40f7-87b6-08a45f62c170","Type":"ContainerStarted","Data":"1512ee0748bfe302d2f18dfa3332c85d25c634d452b361134402d7f505574153"} Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.165173 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3b79e654-374a-40f7-87b6-08a45f62c170","Type":"ContainerStarted","Data":"d536841ee2c4b617a21c0537922e656a49a9f1d3f509652a5e6baf3b0b1a3027"} Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.165349 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.168112 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d606151b-c042-4691-89a7-f1d8f21d033d","Type":"ContainerStarted","Data":"0cbde4a6fe3080d19c7c7dd62f73a3a57718dd8b28e4b3a9173a820e94eaf315"} Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.193547 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.19351624 podStartE2EDuration="2.19351624s" podCreationTimestamp="2025-11-24 12:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:03.181661589 +0000 UTC m=+4529.166481888" watchObservedRunningTime="2025-11-24 12:22:03.19351624 +0000 UTC m=+4529.178336539" Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.668148 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 12:22:03 crc kubenswrapper[4752]: W1124 12:22:03.674475 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fefe6b9_4e33_4ade_9212_0d2f8f6d4a33.slice/crio-18e726812ae7fe3fc82af4addeb6a20138ef95b94d1eb80303c742eafb5a7848 WatchSource:0}: Error finding container 18e726812ae7fe3fc82af4addeb6a20138ef95b94d1eb80303c742eafb5a7848: Status 404 returned error can't find the container with id 18e726812ae7fe3fc82af4addeb6a20138ef95b94d1eb80303c742eafb5a7848 Nov 24 12:22:03 crc kubenswrapper[4752]: I1124 12:22:03.728047 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:22:03 crc kubenswrapper[4752]: E1124 12:22:03.728673 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:22:04 crc kubenswrapper[4752]: I1124 12:22:04.177910 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33","Type":"ContainerStarted","Data":"d7c7b00aad63f113b2fa409f70a525a1e2762c0d51ed374849e1c589cb2db174"} Nov 24 12:22:04 crc kubenswrapper[4752]: I1124 12:22:04.177969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33","Type":"ContainerStarted","Data":"18e726812ae7fe3fc82af4addeb6a20138ef95b94d1eb80303c742eafb5a7848"} Nov 24 12:22:06 crc kubenswrapper[4752]: I1124 12:22:06.202295 4752 generic.go:334] "Generic (PLEG): container finished" podID="d606151b-c042-4691-89a7-f1d8f21d033d" containerID="0cbde4a6fe3080d19c7c7dd62f73a3a57718dd8b28e4b3a9173a820e94eaf315" exitCode=0 Nov 24 12:22:06 crc kubenswrapper[4752]: I1124 12:22:06.202393 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d606151b-c042-4691-89a7-f1d8f21d033d","Type":"ContainerDied","Data":"0cbde4a6fe3080d19c7c7dd62f73a3a57718dd8b28e4b3a9173a820e94eaf315"} Nov 24 12:22:07 crc kubenswrapper[4752]: I1124 12:22:07.212652 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d606151b-c042-4691-89a7-f1d8f21d033d","Type":"ContainerStarted","Data":"a5a2732d2e14912dae9909886983adcdd06f7e43c584b3e2e166a5bd4cacc441"} Nov 24 12:22:07 crc kubenswrapper[4752]: I1124 12:22:07.236702 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.236680999 podStartE2EDuration="7.236680999s" podCreationTimestamp="2025-11-24 12:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:07.235027972 +0000 UTC m=+4533.219848291" watchObservedRunningTime="2025-11-24 12:22:07.236680999 +0000 UTC m=+4533.221501288" Nov 24 12:22:08 crc kubenswrapper[4752]: I1124 12:22:08.224701 4752 generic.go:334] "Generic (PLEG): container finished" podID="1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33" containerID="d7c7b00aad63f113b2fa409f70a525a1e2762c0d51ed374849e1c589cb2db174" exitCode=0 Nov 24 12:22:08 crc kubenswrapper[4752]: I1124 12:22:08.224781 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33","Type":"ContainerDied","Data":"d7c7b00aad63f113b2fa409f70a525a1e2762c0d51ed374849e1c589cb2db174"} Nov 24 12:22:08 crc kubenswrapper[4752]: I1124 12:22:08.876945 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.239588 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33","Type":"ContainerStarted","Data":"bc9bdd13e54b1e28c4c8562f923be8cfb0f4ec0ce3252603d3952a1895034064"} Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.268241 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.268219030000001 podStartE2EDuration="8.26821903s" podCreationTimestamp="2025-11-24 12:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:09.263994409 +0000 UTC m=+4535.248814728" watchObservedRunningTime="2025-11-24 12:22:09.26821903 +0000 UTC m=+4535.253039319" Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.412112 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.485492 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"] Nov 24 
Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.485835 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="dnsmasq-dns" containerID="cri-o://27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db" gracePeriod=10
Nov 24 12:22:09 crc kubenswrapper[4752]: I1124 12:22:09.997594 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.080562 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config\") pod \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") "
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.080625 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92lhh\" (UniqueName: \"kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh\") pod \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") "
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.080728 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc\") pod \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\" (UID: \"a2d34cc2-feb3-4657-996c-e2c3faf94b7d\") "
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.090912 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh" (OuterVolumeSpecName: "kube-api-access-92lhh") pod "a2d34cc2-feb3-4657-996c-e2c3faf94b7d" (UID: "a2d34cc2-feb3-4657-996c-e2c3faf94b7d"). InnerVolumeSpecName "kube-api-access-92lhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.117932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2d34cc2-feb3-4657-996c-e2c3faf94b7d" (UID: "a2d34cc2-feb3-4657-996c-e2c3faf94b7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.120162 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config" (OuterVolumeSpecName: "config") pod "a2d34cc2-feb3-4657-996c-e2c3faf94b7d" (UID: "a2d34cc2-feb3-4657-996c-e2c3faf94b7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.182954 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-config\") on node \"crc\" DevicePath \"\""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.183022 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92lhh\" (UniqueName: \"kubernetes.io/projected/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-kube-api-access-92lhh\") on node \"crc\" DevicePath \"\""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.183031 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d34cc2-feb3-4657-996c-e2c3faf94b7d-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.248316 4752 generic.go:334] "Generic (PLEG): container finished" podID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerID="27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db" exitCode=0
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.248365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" event={"ID":"a2d34cc2-feb3-4657-996c-e2c3faf94b7d","Type":"ContainerDied","Data":"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"}
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.248398 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb" event={"ID":"a2d34cc2-feb3-4657-996c-e2c3faf94b7d","Type":"ContainerDied","Data":"ca0c8a1750df00d5333c96e4240f3bcf4704f8b8267667bab473b54599bb9162"}
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.248413 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-vdbdb"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.248419 4752 scope.go:117] "RemoveContainer" containerID="27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.264829 4752 scope.go:117] "RemoveContainer" containerID="1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.281804 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"]
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.290462 4752 scope.go:117] "RemoveContainer" containerID="27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.290817 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-vdbdb"]
Nov 24 12:22:10 crc kubenswrapper[4752]: E1124 12:22:10.291020 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db\": container with ID starting with 27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db not found: ID does not exist" containerID="27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.291062 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db"} err="failed to get container status \"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db\": rpc error: code = NotFound desc = could not find container \"27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db\": container with ID starting with 27ec014031f270418b43709ac8278ebb6a7fc593fc2fcbdcc2487db78881b8db not found: ID does not exist"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.291096 4752 scope.go:117] "RemoveContainer" containerID="1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d"
Nov 24 12:22:10 crc kubenswrapper[4752]: E1124 12:22:10.291692 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d\": container with ID starting with 1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d not found: ID does not exist" containerID="1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.291714 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d"} err="failed to get container status \"1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d\": rpc error: code = NotFound desc = could not find container \"1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d\": container with ID starting with 1497fe85263867a73349c08db4c7116ad6b22625e350650a9841d74e7807770d not found: ID does not exist"
Nov 24 12:22:10 crc kubenswrapper[4752]: I1124 12:22:10.737147 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" path="/var/lib/kubelet/pods/a2d34cc2-feb3-4657-996c-e2c3faf94b7d/volumes"
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 24 12:22:11 crc kubenswrapper[4752]: I1124 12:22:11.631226 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 24 12:22:11 crc kubenswrapper[4752]: I1124 12:22:11.964973 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.192196 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vdn5c"] Nov 24 12:22:12 crc kubenswrapper[4752]: E1124 12:22:12.192584 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="dnsmasq-dns" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.192605 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="dnsmasq-dns" Nov 24 12:22:12 crc kubenswrapper[4752]: E1124 12:22:12.192642 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="init" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.192650 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="init" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.192863 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d34cc2-feb3-4657-996c-e2c3faf94b7d" containerName="dnsmasq-dns" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.194219 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.203636 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vdn5c"] Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.317501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw59c\" (UniqueName: \"kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.317600 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.317639 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.419192 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw59c\" (UniqueName: \"kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " 
pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.419275 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.419314 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.419958 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.419983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.448972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw59c\" (UniqueName: \"kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c\") pod \"community-operators-vdn5c\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") " pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:12 crc kubenswrapper[4752]: I1124 12:22:12.512221 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.008337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vdn5c"] Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.128983 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.129302 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.195460 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.275644 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerID="701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42" exitCode=0 Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.275766 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerDied","Data":"701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42"} Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.275821 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerStarted","Data":"52a56a2f80fd62058d68ce3cdc968e54c11e4f54121228c6963f2d0e3a6ec459"} Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.350033 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.792987 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 24 12:22:13 crc kubenswrapper[4752]: I1124 12:22:13.868176 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 24 12:22:14 crc kubenswrapper[4752]: I1124 12:22:14.286103 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerID="f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4" exitCode=0 Nov 24 12:22:14 crc kubenswrapper[4752]: I1124 12:22:14.286266 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerDied","Data":"f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4"} Nov 24 12:22:15 crc kubenswrapper[4752]: I1124 12:22:15.299445 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerStarted","Data":"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733"} Nov 24 12:22:15 crc kubenswrapper[4752]: I1124 12:22:15.325362 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vdn5c" podStartSLOduration=1.9052554449999999 podStartE2EDuration="3.325340795s" podCreationTimestamp="2025-11-24 12:22:12 +0000 UTC" firstStartedPulling="2025-11-24 12:22:13.27797951 +0000 UTC m=+4539.262799799" lastFinishedPulling="2025-11-24 12:22:14.69806485 
Nov 24 12:22:16 crc kubenswrapper[4752]: I1124 12:22:16.728580 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331"
Nov 24 12:22:17 crc kubenswrapper[4752]: I1124 12:22:17.317952 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3"}
Nov 24 12:22:22 crc kubenswrapper[4752]: I1124 12:22:22.512902 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vdn5c"
Nov 24 12:22:22 crc kubenswrapper[4752]: I1124 12:22:22.513523 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vdn5c"
Nov 24 12:22:22 crc kubenswrapper[4752]: I1124 12:22:22.562585 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vdn5c"
Nov 24 12:22:23 crc kubenswrapper[4752]: I1124 12:22:23.445614 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vdn5c"
Nov 24 12:22:23 crc kubenswrapper[4752]: I1124 12:22:23.509953 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vdn5c"]
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.399777 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vdn5c" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="registry-server" containerID="cri-o://d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733" gracePeriod=2
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.808037 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vdn5c"
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.933927 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw59c\" (UniqueName: \"kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c\") pod \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") "
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.933968 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content\") pod \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") "
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.933986 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities\") pod \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\" (UID: \"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0\") "
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.935367 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities" (OuterVolumeSpecName: "utilities") pod "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" (UID: "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:22:25 crc kubenswrapper[4752]: I1124 12:22:25.941426 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c" (OuterVolumeSpecName: "kube-api-access-kw59c") pod "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" (UID: "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0"). InnerVolumeSpecName "kube-api-access-kw59c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.008816 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" (UID: "3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.036147 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw59c\" (UniqueName: \"kubernetes.io/projected/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-kube-api-access-kw59c\") on node \"crc\" DevicePath \"\"" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.036194 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.036208 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.412048 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerID="d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733" exitCode=0 Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.412098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerDied","Data":"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733"} Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.412135 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdn5c" event={"ID":"3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0","Type":"ContainerDied","Data":"52a56a2f80fd62058d68ce3cdc968e54c11e4f54121228c6963f2d0e3a6ec459"} Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.412134 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdn5c" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.412166 4752 scope.go:117] "RemoveContainer" containerID="d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.449646 4752 scope.go:117] "RemoveContainer" containerID="f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.466201 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vdn5c"] Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.472981 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vdn5c"] Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.480450 4752 scope.go:117] "RemoveContainer" containerID="701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.511079 4752 scope.go:117] "RemoveContainer" containerID="d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733" Nov 24 12:22:26 crc kubenswrapper[4752]: E1124 12:22:26.511664 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733\": container with ID starting with d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733 not found: ID does not exist" containerID="d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.511699 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733"} err="failed to get container status \"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733\": rpc error: code = NotFound desc = could not find container \"d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733\": container with ID starting with d8a1a4306f442eea8319c5c851410530fbcd00ed7af51244a630caac364d9733 not found: ID does not exist" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.511721 4752 scope.go:117] "RemoveContainer" containerID="f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4" Nov 24 12:22:26 crc kubenswrapper[4752]: E1124 12:22:26.512116 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4\": container with ID starting with f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4 not found: ID does not exist" containerID="f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.512155 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4"} err="failed to get container status \"f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4\": rpc error: code = NotFound desc = could not find container \"f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4\": container with ID starting with f22cd05757593ab1509e6c34e80bce705ff47e3cab8e4d64997007627fd132e4 not found: ID does not exist" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.512168 4752 scope.go:117] "RemoveContainer" 
containerID="701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42" Nov 24 12:22:26 crc kubenswrapper[4752]: E1124 12:22:26.512381 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42\": container with ID starting with 701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42 not found: ID does not exist" containerID="701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.512405 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42"} err="failed to get container status \"701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42\": rpc error: code = NotFound desc = could not find container \"701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42\": container with ID starting with 701845971e89b040ebc0643a8c18ed23150de3fd670adce565a8e75cf2bd7e42 not found: ID does not exist" Nov 24 12:22:26 crc kubenswrapper[4752]: I1124 12:22:26.737066 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" path="/var/lib/kubelet/pods/3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0/volumes" Nov 24 12:22:33 crc kubenswrapper[4752]: I1124 12:22:33.489205 4752 generic.go:334] "Generic (PLEG): container finished" podID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerID="43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669" exitCode=0 Nov 24 12:22:33 crc kubenswrapper[4752]: I1124 12:22:33.489343 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerDied","Data":"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669"} Nov 24 12:22:34 crc kubenswrapper[4752]: I1124 12:22:34.499114 4752 generic.go:334] "Generic (PLEG): container finished" podID="90991bb7-c6be-4088-9071-14bd5884b506" containerID="801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9" exitCode=0 Nov 24 12:22:34 crc kubenswrapper[4752]: I1124 12:22:34.499209 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerDied","Data":"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9"} Nov 24 12:22:34 crc kubenswrapper[4752]: I1124 12:22:34.501772 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerStarted","Data":"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e"} Nov 24 12:22:34 crc kubenswrapper[4752]: I1124 12:22:34.502049 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 24 12:22:34 crc kubenswrapper[4752]: I1124 12:22:34.563695 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.563667808 podStartE2EDuration="36.563667808s" podCreationTimestamp="2025-11-24 12:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:34.55713094 +0000 UTC m=+4560.541951299" watchObservedRunningTime="2025-11-24 12:22:34.563667808 +0000 UTC m=+4560.548488117" Nov 24 12:22:35 
crc kubenswrapper[4752]: I1124 12:22:35.509645 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerStarted","Data":"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad"} Nov 24 12:22:35 crc kubenswrapper[4752]: I1124 12:22:35.510422 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:35 crc kubenswrapper[4752]: I1124 12:22:35.540864 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.540837931 podStartE2EDuration="37.540837931s" podCreationTimestamp="2025-11-24 12:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:35.532865722 +0000 UTC m=+4561.517686031" watchObservedRunningTime="2025-11-24 12:22:35.540837931 +0000 UTC m=+4561.525658220" Nov 24 12:22:49 crc kubenswrapper[4752]: I1124 12:22:49.984979 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 24 12:22:50 crc kubenswrapper[4752]: I1124 12:22:50.354722 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.398771 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"] Nov 24 12:22:55 crc kubenswrapper[4752]: E1124 12:22:55.399549 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="extract-content" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.399561 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="extract-content" Nov 24 12:22:55 crc kubenswrapper[4752]: E1124 12:22:55.399576 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="registry-server" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.399582 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="registry-server" Nov 24 12:22:55 crc kubenswrapper[4752]: E1124 12:22:55.399603 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="extract-utilities" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.399609 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="extract-utilities" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.399764 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1f1012-63b3-4f00-b7b1-1d2f3a62abf0" containerName="registry-server" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.400485 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.410477 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"] Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.432958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.433027 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.433077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6x8f\" (UniqueName: \"kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.534528 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.534937 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.534981 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6x8f\" (UniqueName: \"kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.535888 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.535902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.553408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6x8f\" (UniqueName: 
\"kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f\") pod \"dnsmasq-dns-5b7946d7b9-2ph2n\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:55 crc kubenswrapper[4752]: I1124 12:22:55.719286 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.038339 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.139965 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"] Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.682470 4752 generic.go:334] "Generic (PLEG): container finished" podID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerID="2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05" exitCode=0 Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.682521 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" event={"ID":"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582","Type":"ContainerDied","Data":"2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05"} Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.682874 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" event={"ID":"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582","Type":"ContainerStarted","Data":"54d3196f8ce8e288a9f3b71217dfaf5cbe8f9efb987a35983cf642af4b89e2a1"} Nov 24 12:22:56 crc kubenswrapper[4752]: I1124 12:22:56.835332 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:22:57 crc kubenswrapper[4752]: I1124 12:22:57.691836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" event={"ID":"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582","Type":"ContainerStarted","Data":"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59"} Nov 24 12:22:57 crc kubenswrapper[4752]: I1124 12:22:57.692179 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:22:57 crc kubenswrapper[4752]: I1124 12:22:57.711557 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" podStartSLOduration=2.711536557 podStartE2EDuration="2.711536557s" podCreationTimestamp="2025-11-24 12:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:22:57.708089228 +0000 UTC m=+4583.692909517" watchObservedRunningTime="2025-11-24 12:22:57.711536557 +0000 UTC m=+4583.696356846" Nov 24 12:22:57 crc kubenswrapper[4752]: I1124 12:22:57.883523 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="rabbitmq" containerID="cri-o://26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e" gracePeriod=604799 Nov 24 12:22:58 crc kubenswrapper[4752]: I1124 12:22:58.669251 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="rabbitmq" containerID="cri-o://8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad" gracePeriod=604799 Nov 24 12:22:59 crc 
Nov 24 12:22:59 crc kubenswrapper[4752]: I1124 12:22:59.981904 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.241:5672: connect: connection refused"
Nov 24 12:23:00 crc kubenswrapper[4752]: I1124 12:23:00.353681 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5672: connect: connection refused"
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.489040 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599394 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599431 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjk6t\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599455 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599549 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599587 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599857 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.599942 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info\") pod \"1d501403-cb13-4634-9ca3-c5bacc88afbb\" (UID: \"1d501403-cb13-4634-9ca3-c5bacc88afbb\") "
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.602885 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.604225 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.607517 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.609890 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info" (OuterVolumeSpecName: "pod-info") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.610245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t" (OuterVolumeSpecName: "kube-api-access-jjk6t") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "kube-api-access-jjk6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.611288 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.620056 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17" (OuterVolumeSpecName: "persistence") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.633386 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf" (OuterVolumeSpecName: "server-conf") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.686730 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1d501403-cb13-4634-9ca3-c5bacc88afbb" (UID: "1d501403-cb13-4634-9ca3-c5bacc88afbb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701773 4752 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d501403-cb13-4634-9ca3-c5bacc88afbb-pod-info\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701804 4752 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701814 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701823 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjk6t\" (UniqueName: \"kubernetes.io/projected/1d501403-cb13-4634-9ca3-c5bacc88afbb-kube-api-access-jjk6t\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701832 4752 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d501403-cb13-4634-9ca3-c5bacc88afbb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701840 4752 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d501403-cb13-4634-9ca3-c5bacc88afbb-server-conf\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701847 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701858 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d501403-cb13-4634-9ca3-c5bacc88afbb-rabbitmq-erlang-cookie\") on node \"crc\"
DevicePath \"\"" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.701894 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") on node \"crc\" " Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.717228 4752 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.717392 4752 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17") on node "crc" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.740812 4752 generic.go:334] "Generic (PLEG): container finished" podID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerID="26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e" exitCode=0 Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.740969 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.741916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerDied","Data":"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e"} Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.742036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1d501403-cb13-4634-9ca3-c5bacc88afbb","Type":"ContainerDied","Data":"59d83d68c9d55f83982b8a9bcbbe111bbbae86f6570e5780c0b498b1d45ddc40"} Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.742147 4752 scope.go:117] "RemoveContainer" containerID="26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.787167 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.798950 4752 scope.go:117] "RemoveContainer" containerID="43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.799924 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.803109 4752 reconciler_common.go:293] "Volume detached for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.809625 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:23:04 crc kubenswrapper[4752]: E1124 12:23:04.810081 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="rabbitmq" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.810102 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="rabbitmq" Nov 24 12:23:04 crc kubenswrapper[4752]: E1124 12:23:04.810117 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" 
containerName="setup-container" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.810124 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="setup-container" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.810338 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" containerName="rabbitmq" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.811700 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.813644 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.813781 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-96klc" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.813644 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.814504 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.815513 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.816988 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.877369 4752 scope.go:117] "RemoveContainer" containerID="26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e" Nov 24 12:23:04 crc kubenswrapper[4752]: E1124 12:23:04.877877 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e\": container with ID starting with 26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e not found: ID does not exist" containerID="26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.878064 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e"} err="failed to get container status \"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e\": rpc error: code = NotFound desc = could not find container \"26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e\": container with ID starting with 26343c6493ffa37f66dfae8ab37d1730bba8878ab6a73ede1a6603f98acb108e not found: ID does not exist" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.878173 4752 scope.go:117] "RemoveContainer" containerID="43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669" Nov 24 12:23:04 crc kubenswrapper[4752]: E1124 12:23:04.878599 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669\": container with ID starting with 43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669 not found: ID does not exist" containerID="43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.878624 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669"} err="failed to get container status \"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669\": rpc error: code = NotFound desc = could not find container \"43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669\": container with ID starting with 43190f620818c55969439e084c3a8dc56ded7b65f5caa0134917550722135669 not found: ID does not exist" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904514 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904576 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d47e62f-ca69-4327-97d4-c8e10e7bc522-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904616 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904690 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904713 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d47e62f-ca69-4327-97d4-c8e10e7bc522-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904732 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904819 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-plugins-conf\") 
pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:04 crc kubenswrapper[4752]: I1124 12:23:04.904858 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhp9t\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-kube-api-access-mhp9t\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.006342 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.006393 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d47e62f-ca69-4327-97d4-c8e10e7bc522-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.006422 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.006467 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.007302 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.006508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.007427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhp9t\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-kube-api-access-mhp9t\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.007451 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 
12:23:05.007480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d47e62f-ca69-4327-97d4-c8e10e7bc522-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.007517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.008094 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.008243 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.013109 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d47e62f-ca69-4327-97d4-c8e10e7bc522-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.013133 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.013357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d47e62f-ca69-4327-97d4-c8e10e7bc522-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.013509 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d47e62f-ca69-4327-97d4-c8e10e7bc522-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.025348 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
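
The "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..." line above (and the matching "Skipping UnmountDevice..." earlier in the teardown) is expected behavior with kubevirt.io.hostpath-provisioner: a CSI driver that does not advertise the STAGE_UNSTAGE_VOLUME node capability gets no NodeStageVolume/NodeUnstageVolume round-trip, so kubelet notes the skip and goes straight to the per-pod SetUp/TearDown calls that dominate this log. A minimal sketch of where that decision originates, assuming the github.com/container-storage-interface/spec Go bindings; this is illustrative only, not the provisioner's actual source:

package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
)

type nodeServer struct{}

func (ns *nodeServer) NodeGetCapabilities(ctx context.Context, req *csi.NodeGetCapabilitiesRequest) (*csi.NodeGetCapabilitiesResponse, error) {
	// An empty capability list means "no staging step": kubelet will skip
	// MountDevice/UnmountDevice for this driver's volumes, exactly as the
	// csi_attacher.go messages in this journal report. A driver that does
	// want the staging step would instead return a list containing:
	//
	//   &csi.NodeServiceCapability{
	//       Type: &csi.NodeServiceCapability_Rpc{
	//           Rpc: &csi.NodeServiceCapability_RPC{
	//               Type: csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME,
	//           },
	//       },
	//   }
	return &csi.NodeGetCapabilitiesResponse{}, nil
}

func main() {
	resp, _ := (&nodeServer{}).NodeGetCapabilities(context.Background(), &csi.NodeGetCapabilitiesRequest{})
	fmt.Printf("node capabilities advertised: %d\n", len(resp.Capabilities))
}

Note that kubelet still logs "MountVolume.MountDevice succeeded" immediately after the skip (as on the next line): the operation completes as a no-op so the volume state machine can advance to SetUp.
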
Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.025387 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cec840fc310b133ee765763ff04aa0c368df6d959d37e1bdbc23d48f8ebe2d97/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.026736 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhp9t\" (UniqueName: \"kubernetes.io/projected/6d47e62f-ca69-4327-97d4-c8e10e7bc522-kube-api-access-mhp9t\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.056672 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d80a964-a5fe-4cde-9b86-e065487a1d17\") pod \"rabbitmq-server-0\" (UID: \"6d47e62f-ca69-4327-97d4-c8e10e7bc522\") " pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.190365 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.200060 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.310773 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.310828 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.310898 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.310939 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.310982 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.311050 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.311107 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.311132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8ltd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.311358 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"90991bb7-c6be-4088-9071-14bd5884b506\" (UID: \"90991bb7-c6be-4088-9071-14bd5884b506\") " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.312403 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.312452 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.313509 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.316669 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.324010 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info" (OuterVolumeSpecName: "pod-info") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.327287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd" (OuterVolumeSpecName: "kube-api-access-t8ltd") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "kube-api-access-t8ltd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.330792 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398" (OuterVolumeSpecName: "persistence") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "pvc-7ce8f75d-e230-4e12-879b-47127d511398". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.341250 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf" (OuterVolumeSpecName: "server-conf") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.408391 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "90991bb7-c6be-4088-9071-14bd5884b506" (UID: "90991bb7-c6be-4088-9071-14bd5884b506"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.412637 4752 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-server-conf\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413001 4752 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90991bb7-c6be-4088-9071-14bd5884b506-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413021 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413036 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413047 4752 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90991bb7-c6be-4088-9071-14bd5884b506-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413059 4752 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90991bb7-c6be-4088-9071-14bd5884b506-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413068 4752 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90991bb7-c6be-4088-9071-14bd5884b506-pod-info\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413075 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8ltd\" (UniqueName: \"kubernetes.io/projected/90991bb7-c6be-4088-9071-14bd5884b506-kube-api-access-t8ltd\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.413113 4752 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") on node \"crc\" " Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.438613 4752 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.438834 4752 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7ce8f75d-e230-4e12-879b-47127d511398" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398") on node "crc" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.514658 4752 reconciler_common.go:293] "Volume detached for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.687937 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.720634 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.752313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d47e62f-ca69-4327-97d4-c8e10e7bc522","Type":"ContainerStarted","Data":"a144f527d7e3d21915c7d2de8149911b324e724a3a36c80d64cfe176d63785cf"} Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.756101 4752 generic.go:334] "Generic (PLEG): container finished" podID="90991bb7-c6be-4088-9071-14bd5884b506" containerID="8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad" exitCode=0 Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.756169 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerDied","Data":"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad"} Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.756227 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90991bb7-c6be-4088-9071-14bd5884b506","Type":"ContainerDied","Data":"0548b5244f526c9a91003e029c1df6ed4fc773ec3e511d04df8a4c2fc6e6679b"} Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.756251 4752 scope.go:117] "RemoveContainer" containerID="8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.756490 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.764041 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"] Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.764299 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="dnsmasq-dns" containerID="cri-o://d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f" gracePeriod=10 Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.818192 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.826730 4752 scope.go:117] "RemoveContainer" containerID="801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.828402 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.846124 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:23:05 crc kubenswrapper[4752]: E1124 12:23:05.846545 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="rabbitmq" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.846563 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="rabbitmq" Nov 24 12:23:05 crc kubenswrapper[4752]: E1124 12:23:05.846576 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="setup-container" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.846584 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="setup-container" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.846735 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="90991bb7-c6be-4088-9071-14bd5884b506" containerName="rabbitmq" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.847779 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.856451 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6xkh5" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.856661 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.856848 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.856899 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.857125 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.860014 4752 scope.go:117] "RemoveContainer" containerID="8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad" Nov 24 12:23:05 crc kubenswrapper[4752]: E1124 12:23:05.861664 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad\": container with ID starting with 8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad not found: ID does not exist" containerID="8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.861709 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad"} err="failed to get container status \"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad\": rpc error: code = NotFound desc = could not find container \"8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad\": container with ID starting with 8d849e7737f1fed3abc77c2db9c5e5b94b6a304de8781d9cd29ff61a8a86f9ad not found: ID does not exist" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.861760 4752 scope.go:117] "RemoveContainer" containerID="801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9" Nov 24 12:23:05 crc kubenswrapper[4752]: E1124 12:23:05.862221 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9\": container with ID starting with 801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9 not found: ID does not exist" containerID="801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.862267 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9"} err="failed to get container status \"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9\": rpc error: code = NotFound desc = could not find container \"801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9\": container with ID starting with 801285b08d9741fcb790faaa434e8388447bab9cc1187e70fe84e0ad30a689a9 not found: ID does not exist" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.864012 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920637 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920702 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920776 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920800 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920818 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920866 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgrw9\" (UniqueName: \"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-kube-api-access-vgrw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920885 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:05 crc kubenswrapper[4752]: I1124 12:23:05.920907 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022788 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022840 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022870 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrw9\" (UniqueName: \"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-kube-api-access-vgrw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022964 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.022998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.023055 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.023084 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.023111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.024233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.025024 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.025300 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.025627 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.028653 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.028917 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d6896910fa2a6fe840e1e468d56f0adfe5685e733944c9fc4c358b75510fd35/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.029960 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.030040 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.031937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.044201 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgrw9\" (UniqueName: \"kubernetes.io/projected/ca78463f-b2cb-49d4-96a1-44c8b1f2ae85-kube-api-access-vgrw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.063919 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7ce8f75d-e230-4e12-879b-47127d511398\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce8f75d-e230-4e12-879b-47127d511398\") pod \"rabbitmq-cell1-server-0\" (UID: \"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.179726 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.184458 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.225857 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config\") pod \"2c417dbd-7198-4e8f-9136-a64236e924e8\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.225915 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6st\" (UniqueName: \"kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st\") pod \"2c417dbd-7198-4e8f-9136-a64236e924e8\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.225968 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc\") pod \"2c417dbd-7198-4e8f-9136-a64236e924e8\" (UID: \"2c417dbd-7198-4e8f-9136-a64236e924e8\") " Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.243579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st" (OuterVolumeSpecName: "kube-api-access-5j6st") pod "2c417dbd-7198-4e8f-9136-a64236e924e8" (UID: "2c417dbd-7198-4e8f-9136-a64236e924e8"). InnerVolumeSpecName "kube-api-access-5j6st". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.329110 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6st\" (UniqueName: \"kubernetes.io/projected/2c417dbd-7198-4e8f-9136-a64236e924e8-kube-api-access-5j6st\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.463853 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c417dbd-7198-4e8f-9136-a64236e924e8" (UID: "2c417dbd-7198-4e8f-9136-a64236e924e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.532384 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.571435 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config" (OuterVolumeSpecName: "config") pod "2c417dbd-7198-4e8f-9136-a64236e924e8" (UID: "2c417dbd-7198-4e8f-9136-a64236e924e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.602644 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.634162 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c417dbd-7198-4e8f-9136-a64236e924e8-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.746056 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d501403-cb13-4634-9ca3-c5bacc88afbb" path="/var/lib/kubelet/pods/1d501403-cb13-4634-9ca3-c5bacc88afbb/volumes" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.747169 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90991bb7-c6be-4088-9071-14bd5884b506" path="/var/lib/kubelet/pods/90991bb7-c6be-4088-9071-14bd5884b506/volumes" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.765540 4752 generic.go:334] "Generic (PLEG): container finished" podID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerID="d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f" exitCode=0 Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.765636 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.765635 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" event={"ID":"2c417dbd-7198-4e8f-9136-a64236e924e8","Type":"ContainerDied","Data":"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f"} Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.765774 4752 scope.go:117] "RemoveContainer" containerID="d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.765787 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cfmff" event={"ID":"2c417dbd-7198-4e8f-9136-a64236e924e8","Type":"ContainerDied","Data":"cdbe2a61ef69e89f3118849f2b3b4e155b1394b5b8dc3e4a2183999f70bf5fc4"} Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.770307 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85","Type":"ContainerStarted","Data":"4bb0115010423a4eb3d36959fce3b3e1f1c9fe3fabca87a92682c007c3d1707e"} Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.795679 4752 scope.go:117] "RemoveContainer" containerID="d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82" Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.803617 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"] Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.809586 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cfmff"] Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.815097 4752 scope.go:117] "RemoveContainer" containerID="d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f" Nov 24 12:23:06 crc kubenswrapper[4752]: E1124 12:23:06.815572 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f\": container with ID starting with 
Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.815609 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f"} err="failed to get container status \"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f\": rpc error: code = NotFound desc = could not find container \"d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f\": container with ID starting with d8cfee982fb6af5839f19bb3eb5483e4fb12284880e3b9b5ec07b692d160b33f not found: ID does not exist"
Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.815633 4752 scope.go:117] "RemoveContainer" containerID="d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82"
Nov 24 12:23:06 crc kubenswrapper[4752]: E1124 12:23:06.815981 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82\": container with ID starting with d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82 not found: ID does not exist" containerID="d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82"
Nov 24 12:23:06 crc kubenswrapper[4752]: I1124 12:23:06.816007 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82"} err="failed to get container status \"d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82\": rpc error: code = NotFound desc = could not find container \"d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82\": container with ID starting with d5b820a96fb0e01c00f11737385f5cfb27af1fbb556978dbbc29656fb9e37e82 not found: ID does not exist"
Nov 24 12:23:07 crc kubenswrapper[4752]: I1124 12:23:07.779914 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85","Type":"ContainerStarted","Data":"88f199cf35ee31563048ddecdd0f0f08201dd432a4c66fb5b569a48de0fb61f7"}
Nov 24 12:23:07 crc kubenswrapper[4752]: I1124 12:23:07.782091 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d47e62f-ca69-4327-97d4-c8e10e7bc522","Type":"ContainerStarted","Data":"5e20ba19a4bd5c45a7e4db77939fe4ed8edbc5e5b9d68143c4bf66a1851b0047"}
Nov 24 12:23:08 crc kubenswrapper[4752]: I1124 12:23:08.739798 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" path="/var/lib/kubelet/pods/2c417dbd-7198-4e8f-9136-a64236e924e8/volumes"
Nov 24 12:23:39 crc kubenswrapper[4752]: I1124 12:23:39.019704 4752 generic.go:334] "Generic (PLEG): container finished" podID="6d47e62f-ca69-4327-97d4-c8e10e7bc522" containerID="5e20ba19a4bd5c45a7e4db77939fe4ed8edbc5e5b9d68143c4bf66a1851b0047" exitCode=0
Nov 24 12:23:39 crc kubenswrapper[4752]: I1124 12:23:39.019800 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d47e62f-ca69-4327-97d4-c8e10e7bc522","Type":"ContainerDied","Data":"5e20ba19a4bd5c45a7e4db77939fe4ed8edbc5e5b9d68143c4bf66a1851b0047"}
Nov 24 12:23:40 crc kubenswrapper[4752]: I1124 12:23:40.030851 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d47e62f-ca69-4327-97d4-c8e10e7bc522","Type":"ContainerStarted","Data":"6e95ab3ccb308242aa21389f1ed0e23d55b2d45cf4b8aa55f791f4ed51b5f9d0"}
pod="openstack/rabbitmq-server-0" event={"ID":"6d47e62f-ca69-4327-97d4-c8e10e7bc522","Type":"ContainerStarted","Data":"6e95ab3ccb308242aa21389f1ed0e23d55b2d45cf4b8aa55f791f4ed51b5f9d0"} Nov 24 12:23:40 crc kubenswrapper[4752]: I1124 12:23:40.032128 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 24 12:23:40 crc kubenswrapper[4752]: I1124 12:23:40.033076 4752 generic.go:334] "Generic (PLEG): container finished" podID="ca78463f-b2cb-49d4-96a1-44c8b1f2ae85" containerID="88f199cf35ee31563048ddecdd0f0f08201dd432a4c66fb5b569a48de0fb61f7" exitCode=0 Nov 24 12:23:40 crc kubenswrapper[4752]: I1124 12:23:40.033120 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85","Type":"ContainerDied","Data":"88f199cf35ee31563048ddecdd0f0f08201dd432a4c66fb5b569a48de0fb61f7"} Nov 24 12:23:40 crc kubenswrapper[4752]: I1124 12:23:40.064092 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.064072342 podStartE2EDuration="36.064072342s" podCreationTimestamp="2025-11-24 12:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:23:40.062304871 +0000 UTC m=+4626.047125180" watchObservedRunningTime="2025-11-24 12:23:40.064072342 +0000 UTC m=+4626.048892631" Nov 24 12:23:41 crc kubenswrapper[4752]: I1124 12:23:41.041596 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ca78463f-b2cb-49d4-96a1-44c8b1f2ae85","Type":"ContainerStarted","Data":"4b15424f8f269de3e9153a726f69090c54989e51506ca0da63104ba57696e100"} Nov 24 12:23:41 crc kubenswrapper[4752]: I1124 12:23:41.042115 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:23:41 crc kubenswrapper[4752]: I1124 12:23:41.067468 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.067449947 podStartE2EDuration="36.067449947s" podCreationTimestamp="2025-11-24 12:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:23:41.062209996 +0000 UTC m=+4627.047030305" watchObservedRunningTime="2025-11-24 12:23:41.067449947 +0000 UTC m=+4627.052270236" Nov 24 12:23:55 crc kubenswrapper[4752]: I1124 12:23:55.193641 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 24 12:23:56 crc kubenswrapper[4752]: I1124 12:23:56.183525 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.161977 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Nov 24 12:24:07 crc kubenswrapper[4752]: E1124 12:24:07.163114 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="dnsmasq-dns" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.163135 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="dnsmasq-dns" Nov 24 12:24:07 crc kubenswrapper[4752]: E1124 12:24:07.163157 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="init" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.163166 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="init" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.163364 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c417dbd-7198-4e8f-9136-a64236e924e8" containerName="dnsmasq-dns" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.164017 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.166011 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dhdgz" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.169782 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.267448 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2st5\" (UniqueName: \"kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5\") pod \"mariadb-client-1-default\" (UID: \"eb7f18e7-ac96-40ab-b51a-be645fb673bb\") " pod="openstack/mariadb-client-1-default" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.368584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2st5\" (UniqueName: \"kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5\") pod \"mariadb-client-1-default\" (UID: \"eb7f18e7-ac96-40ab-b51a-be645fb673bb\") " pod="openstack/mariadb-client-1-default" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.394606 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2st5\" (UniqueName: \"kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5\") pod \"mariadb-client-1-default\" (UID: \"eb7f18e7-ac96-40ab-b51a-be645fb673bb\") " pod="openstack/mariadb-client-1-default" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.511463 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 24 12:24:07 crc kubenswrapper[4752]: I1124 12:24:07.994466 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 24 12:24:08 crc kubenswrapper[4752]: I1124 12:24:08.263137 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"eb7f18e7-ac96-40ab-b51a-be645fb673bb","Type":"ContainerStarted","Data":"789479bb15c576b15ff87bffd6d9a716059e49b5a4b85ae334ff2d6824df43fb"} Nov 24 12:24:09 crc kubenswrapper[4752]: I1124 12:24:09.272294 4752 generic.go:334] "Generic (PLEG): container finished" podID="eb7f18e7-ac96-40ab-b51a-be645fb673bb" containerID="0c3394fc59cac84a57a561fc24bad2cb7b841f6dcd44724cee2d9c7a012cd29e" exitCode=0 Nov 24 12:24:09 crc kubenswrapper[4752]: I1124 12:24:09.272666 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"eb7f18e7-ac96-40ab-b51a-be645fb673bb","Type":"ContainerDied","Data":"0c3394fc59cac84a57a561fc24bad2cb7b841f6dcd44724cee2d9c7a012cd29e"} Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.679162 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.705889 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_eb7f18e7-ac96-40ab-b51a-be645fb673bb/mariadb-client-1-default/0.log" Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.733955 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2st5\" (UniqueName: \"kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5\") pod \"eb7f18e7-ac96-40ab-b51a-be645fb673bb\" (UID: \"eb7f18e7-ac96-40ab-b51a-be645fb673bb\") " Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.743870 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5" (OuterVolumeSpecName: "kube-api-access-h2st5") pod "eb7f18e7-ac96-40ab-b51a-be645fb673bb" (UID: "eb7f18e7-ac96-40ab-b51a-be645fb673bb"). InnerVolumeSpecName "kube-api-access-h2st5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.745202 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.745237 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 24 12:24:10 crc kubenswrapper[4752]: I1124 12:24:10.836040 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2st5\" (UniqueName: \"kubernetes.io/projected/eb7f18e7-ac96-40ab-b51a-be645fb673bb-kube-api-access-h2st5\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.154495 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Nov 24 12:24:11 crc kubenswrapper[4752]: E1124 12:24:11.155024 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7f18e7-ac96-40ab-b51a-be645fb673bb" containerName="mariadb-client-1-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.155050 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7f18e7-ac96-40ab-b51a-be645fb673bb" containerName="mariadb-client-1-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.155224 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7f18e7-ac96-40ab-b51a-be645fb673bb" containerName="mariadb-client-1-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.160866 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.173900 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.245021 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwc68\" (UniqueName: \"kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68\") pod \"mariadb-client-2-default\" (UID: \"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb\") " pod="openstack/mariadb-client-2-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.296776 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789479bb15c576b15ff87bffd6d9a716059e49b5a4b85ae334ff2d6824df43fb" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.296826 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.346694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwc68\" (UniqueName: \"kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68\") pod \"mariadb-client-2-default\" (UID: \"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb\") " pod="openstack/mariadb-client-2-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.363933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwc68\" (UniqueName: \"kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68\") pod \"mariadb-client-2-default\" (UID: \"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb\") " pod="openstack/mariadb-client-2-default" Nov 24 12:24:11 crc kubenswrapper[4752]: I1124 12:24:11.493372 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 24 12:24:12 crc kubenswrapper[4752]: I1124 12:24:12.020246 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 24 12:24:12 crc kubenswrapper[4752]: I1124 12:24:12.311419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb","Type":"ContainerStarted","Data":"870e04b04bfe23347cf123272a897c66beb6a04cfc4a391e9503b677af85f538"} Nov 24 12:24:12 crc kubenswrapper[4752]: I1124 12:24:12.311484 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb","Type":"ContainerStarted","Data":"cedf88f21ffe0f6b5d167201491d6359e147afb406c4d63dbf8bdd4692c351c6"} Nov 24 12:24:12 crc kubenswrapper[4752]: I1124 12:24:12.338779 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.338722748 podStartE2EDuration="1.338722748s" podCreationTimestamp="2025-11-24 12:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:24:12.33182225 +0000 UTC m=+4658.316642579" watchObservedRunningTime="2025-11-24 12:24:12.338722748 +0000 UTC m=+4658.323543037" Nov 24 12:24:12 crc kubenswrapper[4752]: I1124 12:24:12.739656 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7f18e7-ac96-40ab-b51a-be645fb673bb" path="/var/lib/kubelet/pods/eb7f18e7-ac96-40ab-b51a-be645fb673bb/volumes" Nov 24 12:24:13 crc kubenswrapper[4752]: I1124 12:24:13.324086 4752 generic.go:334] "Generic (PLEG): container finished" podID="3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" containerID="870e04b04bfe23347cf123272a897c66beb6a04cfc4a391e9503b677af85f538" exitCode=1 Nov 24 12:24:13 crc kubenswrapper[4752]: I1124 12:24:13.324211 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb","Type":"ContainerDied","Data":"870e04b04bfe23347cf123272a897c66beb6a04cfc4a391e9503b677af85f538"} Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.668689 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.706788 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.712435 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.801898 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwc68\" (UniqueName: \"kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68\") pod \"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb\" (UID: \"3da6f0dc-4bc4-43ca-9bed-895a3042c5fb\") " Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.815177 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68" (OuterVolumeSpecName: "kube-api-access-lwc68") pod "3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" (UID: "3da6f0dc-4bc4-43ca-9bed-895a3042c5fb"). InnerVolumeSpecName "kube-api-access-lwc68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:14 crc kubenswrapper[4752]: I1124 12:24:14.903604 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwc68\" (UniqueName: \"kubernetes.io/projected/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb-kube-api-access-lwc68\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.121504 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Nov 24 12:24:15 crc kubenswrapper[4752]: E1124 12:24:15.121941 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" containerName="mariadb-client-2-default" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.121965 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" containerName="mariadb-client-2-default" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.122133 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" containerName="mariadb-client-2-default" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.122636 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.129759 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.207086 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g99l\" (UniqueName: \"kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l\") pod \"mariadb-client-1\" (UID: \"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681\") " pod="openstack/mariadb-client-1" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.309063 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g99l\" (UniqueName: \"kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l\") pod \"mariadb-client-1\" (UID: \"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681\") " pod="openstack/mariadb-client-1" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.327195 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g99l\" (UniqueName: \"kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l\") pod \"mariadb-client-1\" (UID: \"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681\") " pod="openstack/mariadb-client-1" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.342392 4752 scope.go:117] "RemoveContainer" containerID="870e04b04bfe23347cf123272a897c66beb6a04cfc4a391e9503b677af85f538" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.342714 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.443563 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Nov 24 12:24:15 crc kubenswrapper[4752]: I1124 12:24:15.952824 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 24 12:24:15 crc kubenswrapper[4752]: W1124 12:24:15.963722 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fe0cb5_e664_4f7f_9bf0_cc2fc213b681.slice/crio-e7baeab1da76523a0b4283efbee83991b0fc0810a39e379728c45547036f6e5e WatchSource:0}: Error finding container e7baeab1da76523a0b4283efbee83991b0fc0810a39e379728c45547036f6e5e: Status 404 returned error can't find the container with id e7baeab1da76523a0b4283efbee83991b0fc0810a39e379728c45547036f6e5e Nov 24 12:24:16 crc kubenswrapper[4752]: I1124 12:24:16.351935 4752 generic.go:334] "Generic (PLEG): container finished" podID="c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" containerID="ec15ec535fcc135a03d6d7bbab34b33fb48c690aa05f30294625b57690e24388" exitCode=0 Nov 24 12:24:16 crc kubenswrapper[4752]: I1124 12:24:16.351988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681","Type":"ContainerDied","Data":"ec15ec535fcc135a03d6d7bbab34b33fb48c690aa05f30294625b57690e24388"} Nov 24 12:24:16 crc kubenswrapper[4752]: I1124 12:24:16.352337 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681","Type":"ContainerStarted","Data":"e7baeab1da76523a0b4283efbee83991b0fc0810a39e379728c45547036f6e5e"} Nov 24 12:24:16 crc kubenswrapper[4752]: I1124 12:24:16.736367 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da6f0dc-4bc4-43ca-9bed-895a3042c5fb" path="/var/lib/kubelet/pods/3da6f0dc-4bc4-43ca-9bed-895a3042c5fb/volumes" Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.719115 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.737664 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681/mariadb-client-1/0.log" Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.761903 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.766709 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.855774 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g99l\" (UniqueName: \"kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l\") pod \"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681\" (UID: \"c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681\") " Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.860517 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l" (OuterVolumeSpecName: "kube-api-access-6g99l") pod "c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" (UID: "c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681"). InnerVolumeSpecName "kube-api-access-6g99l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:17 crc kubenswrapper[4752]: I1124 12:24:17.957312 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g99l\" (UniqueName: \"kubernetes.io/projected/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681-kube-api-access-6g99l\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.182516 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Nov 24 12:24:18 crc kubenswrapper[4752]: E1124 12:24:18.182913 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" containerName="mariadb-client-1" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.182934 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" containerName="mariadb-client-1" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.183098 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" containerName="mariadb-client-1" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.183578 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.189458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.261936 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxvd\" (UniqueName: \"kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd\") pod \"mariadb-client-4-default\" (UID: \"7608ee13-4108-4741-8b86-2c50eff5b0a6\") " pod="openstack/mariadb-client-4-default" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.363110 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxvd\" (UniqueName: \"kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd\") pod \"mariadb-client-4-default\" (UID: \"7608ee13-4108-4741-8b86-2c50eff5b0a6\") " pod="openstack/mariadb-client-4-default" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.369567 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7baeab1da76523a0b4283efbee83991b0fc0810a39e379728c45547036f6e5e" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.369620 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.382153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxvd\" (UniqueName: \"kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd\") pod \"mariadb-client-4-default\" (UID: \"7608ee13-4108-4741-8b86-2c50eff5b0a6\") " pod="openstack/mariadb-client-4-default" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.501363 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 24 12:24:18 crc kubenswrapper[4752]: I1124 12:24:18.741197 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681" path="/var/lib/kubelet/pods/c7fe0cb5-e664-4f7f-9bf0-cc2fc213b681/volumes" Nov 24 12:24:19 crc kubenswrapper[4752]: I1124 12:24:19.191762 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 24 12:24:19 crc kubenswrapper[4752]: I1124 12:24:19.376878 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"7608ee13-4108-4741-8b86-2c50eff5b0a6","Type":"ContainerStarted","Data":"c2715b13110eb5eb0c124f434cd1db2d3a07e89d32fa818d9c560cf97dfe0ca3"} Nov 24 12:24:19 crc kubenswrapper[4752]: I1124 12:24:19.376924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"7608ee13-4108-4741-8b86-2c50eff5b0a6","Type":"ContainerStarted","Data":"22d3ad554779f9bbc9d0dd786cb1433f16125fbe7e274241361a9e165e13e713"} Nov 24 12:24:19 crc kubenswrapper[4752]: I1124 12:24:19.397355 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-4-default" podStartSLOduration=1.397334184 podStartE2EDuration="1.397334184s" podCreationTimestamp="2025-11-24 12:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:24:19.390720294 +0000 UTC m=+4665.375540593" watchObservedRunningTime="2025-11-24 12:24:19.397334184 +0000 UTC m=+4665.382154473" Nov 24 12:24:19 crc kubenswrapper[4752]: I1124 12:24:19.438102 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_7608ee13-4108-4741-8b86-2c50eff5b0a6/mariadb-client-4-default/0.log" Nov 24 12:24:20 crc kubenswrapper[4752]: I1124 12:24:20.383811 4752 generic.go:334] "Generic (PLEG): container finished" podID="7608ee13-4108-4741-8b86-2c50eff5b0a6" containerID="c2715b13110eb5eb0c124f434cd1db2d3a07e89d32fa818d9c560cf97dfe0ca3" exitCode=0 Nov 24 12:24:20 crc kubenswrapper[4752]: I1124 12:24:20.383888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"7608ee13-4108-4741-8b86-2c50eff5b0a6","Type":"ContainerDied","Data":"c2715b13110eb5eb0c124f434cd1db2d3a07e89d32fa818d9c560cf97dfe0ca3"} Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.741612 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.777069 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.782549 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.812211 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sxvd\" (UniqueName: \"kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd\") pod \"7608ee13-4108-4741-8b86-2c50eff5b0a6\" (UID: \"7608ee13-4108-4741-8b86-2c50eff5b0a6\") " Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.817685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd" (OuterVolumeSpecName: "kube-api-access-5sxvd") pod "7608ee13-4108-4741-8b86-2c50eff5b0a6" (UID: "7608ee13-4108-4741-8b86-2c50eff5b0a6"). InnerVolumeSpecName "kube-api-access-5sxvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:21 crc kubenswrapper[4752]: I1124 12:24:21.914571 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sxvd\" (UniqueName: \"kubernetes.io/projected/7608ee13-4108-4741-8b86-2c50eff5b0a6-kube-api-access-5sxvd\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:22 crc kubenswrapper[4752]: I1124 12:24:22.400574 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22d3ad554779f9bbc9d0dd786cb1433f16125fbe7e274241361a9e165e13e713" Nov 24 12:24:22 crc kubenswrapper[4752]: I1124 12:24:22.400638 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 24 12:24:22 crc kubenswrapper[4752]: I1124 12:24:22.736610 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7608ee13-4108-4741-8b86-2c50eff5b0a6" path="/var/lib/kubelet/pods/7608ee13-4108-4741-8b86-2c50eff5b0a6/volumes" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.320377 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Nov 24 12:24:26 crc kubenswrapper[4752]: E1124 12:24:26.321229 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7608ee13-4108-4741-8b86-2c50eff5b0a6" containerName="mariadb-client-4-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.321242 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7608ee13-4108-4741-8b86-2c50eff5b0a6" containerName="mariadb-client-4-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.321397 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7608ee13-4108-4741-8b86-2c50eff5b0a6" containerName="mariadb-client-4-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.321926 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.328209 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dhdgz" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.332079 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.387856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5p7p\" (UniqueName: \"kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p\") pod \"mariadb-client-5-default\" (UID: \"6d22a411-d1f1-4edf-9f2e-f5d9a9606308\") " pod="openstack/mariadb-client-5-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.489238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5p7p\" (UniqueName: \"kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p\") pod \"mariadb-client-5-default\" (UID: \"6d22a411-d1f1-4edf-9f2e-f5d9a9606308\") " pod="openstack/mariadb-client-5-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.512997 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5p7p\" (UniqueName: \"kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p\") pod \"mariadb-client-5-default\" (UID: \"6d22a411-d1f1-4edf-9f2e-f5d9a9606308\") " pod="openstack/mariadb-client-5-default" Nov 24 12:24:26 crc kubenswrapper[4752]: I1124 12:24:26.640806 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 24 12:24:27 crc kubenswrapper[4752]: I1124 12:24:27.137111 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 24 12:24:27 crc kubenswrapper[4752]: I1124 12:24:27.449527 4752 generic.go:334] "Generic (PLEG): container finished" podID="6d22a411-d1f1-4edf-9f2e-f5d9a9606308" containerID="c9ce396e8ff9b199b3f6b73f8dee36a4e25d6e3eaa4bd14066301b8cfc934d59" exitCode=0 Nov 24 12:24:27 crc kubenswrapper[4752]: I1124 12:24:27.449618 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"6d22a411-d1f1-4edf-9f2e-f5d9a9606308","Type":"ContainerDied","Data":"c9ce396e8ff9b199b3f6b73f8dee36a4e25d6e3eaa4bd14066301b8cfc934d59"} Nov 24 12:24:27 crc kubenswrapper[4752]: I1124 12:24:27.449820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"6d22a411-d1f1-4edf-9f2e-f5d9a9606308","Type":"ContainerStarted","Data":"c03b552e8be9009b2033ec88d84b452f98bdc3d8ab9b751d9ff033c62ce28b0c"} Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.849362 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.866653 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_6d22a411-d1f1-4edf-9f2e-f5d9a9606308/mariadb-client-5-default/0.log" Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.894106 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.901164 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.949708 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5p7p\" (UniqueName: \"kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p\") pod \"6d22a411-d1f1-4edf-9f2e-f5d9a9606308\" (UID: \"6d22a411-d1f1-4edf-9f2e-f5d9a9606308\") " Nov 24 12:24:28 crc kubenswrapper[4752]: I1124 12:24:28.955666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p" (OuterVolumeSpecName: "kube-api-access-b5p7p") pod "6d22a411-d1f1-4edf-9f2e-f5d9a9606308" (UID: "6d22a411-d1f1-4edf-9f2e-f5d9a9606308"). InnerVolumeSpecName "kube-api-access-b5p7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.039137 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Nov 24 12:24:29 crc kubenswrapper[4752]: E1124 12:24:29.039492 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d22a411-d1f1-4edf-9f2e-f5d9a9606308" containerName="mariadb-client-5-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.039506 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d22a411-d1f1-4edf-9f2e-f5d9a9606308" containerName="mariadb-client-5-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.039652 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d22a411-d1f1-4edf-9f2e-f5d9a9606308" containerName="mariadb-client-5-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.040214 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.045097 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.057819 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5p7p\" (UniqueName: \"kubernetes.io/projected/6d22a411-d1f1-4edf-9f2e-f5d9a9606308-kube-api-access-b5p7p\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.159161 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mwf\" (UniqueName: \"kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf\") pod \"mariadb-client-6-default\" (UID: \"9c9b700e-6515-4ad2-a0da-6edce5673122\") " pod="openstack/mariadb-client-6-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.260593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24mwf\" (UniqueName: \"kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf\") pod \"mariadb-client-6-default\" (UID: \"9c9b700e-6515-4ad2-a0da-6edce5673122\") " pod="openstack/mariadb-client-6-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.445518 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mwf\" (UniqueName: \"kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf\") pod \"mariadb-client-6-default\" (UID: \"9c9b700e-6515-4ad2-a0da-6edce5673122\") " pod="openstack/mariadb-client-6-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.463985 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03b552e8be9009b2033ec88d84b452f98bdc3d8ab9b751d9ff033c62ce28b0c" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.464325 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 24 12:24:29 crc kubenswrapper[4752]: I1124 12:24:29.662247 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 24 12:24:30 crc kubenswrapper[4752]: I1124 12:24:30.185285 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 24 12:24:30 crc kubenswrapper[4752]: I1124 12:24:30.473387 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"9c9b700e-6515-4ad2-a0da-6edce5673122","Type":"ContainerStarted","Data":"f9e1034cb16de78dc658c4449692af4a495475c73da8f0d44cc63782e23f2016"} Nov 24 12:24:30 crc kubenswrapper[4752]: I1124 12:24:30.473430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"9c9b700e-6515-4ad2-a0da-6edce5673122","Type":"ContainerStarted","Data":"69f1bb06a3aef8f1ebd0ca9494afabafc976e79301b4e49aa6404ada40d58224"} Nov 24 12:24:30 crc kubenswrapper[4752]: I1124 12:24:30.491793 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.491769599 podStartE2EDuration="1.491769599s" podCreationTimestamp="2025-11-24 12:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:24:30.488576227 +0000 UTC m=+4676.473396516" watchObservedRunningTime="2025-11-24 12:24:30.491769599 +0000 UTC m=+4676.476589888" Nov 24 12:24:30 crc kubenswrapper[4752]: I1124 12:24:30.740378 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d22a411-d1f1-4edf-9f2e-f5d9a9606308" path="/var/lib/kubelet/pods/6d22a411-d1f1-4edf-9f2e-f5d9a9606308/volumes" Nov 24 12:24:31 crc kubenswrapper[4752]: I1124 12:24:31.481720 4752 generic.go:334] "Generic (PLEG): container finished" podID="9c9b700e-6515-4ad2-a0da-6edce5673122" containerID="f9e1034cb16de78dc658c4449692af4a495475c73da8f0d44cc63782e23f2016" exitCode=1 Nov 24 12:24:31 crc kubenswrapper[4752]: I1124 12:24:31.481773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"9c9b700e-6515-4ad2-a0da-6edce5673122","Type":"ContainerDied","Data":"f9e1034cb16de78dc658c4449692af4a495475c73da8f0d44cc63782e23f2016"} Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.835762 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.885677 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.895976 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.914016 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24mwf\" (UniqueName: \"kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf\") pod \"9c9b700e-6515-4ad2-a0da-6edce5673122\" (UID: \"9c9b700e-6515-4ad2-a0da-6edce5673122\") " Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.919480 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf" (OuterVolumeSpecName: "kube-api-access-24mwf") pod "9c9b700e-6515-4ad2-a0da-6edce5673122" (UID: "9c9b700e-6515-4ad2-a0da-6edce5673122"). InnerVolumeSpecName "kube-api-access-24mwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.990508 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Nov 24 12:24:32 crc kubenswrapper[4752]: E1124 12:24:32.990872 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9b700e-6515-4ad2-a0da-6edce5673122" containerName="mariadb-client-6-default" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.990894 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9b700e-6515-4ad2-a0da-6edce5673122" containerName="mariadb-client-6-default" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.991102 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9b700e-6515-4ad2-a0da-6edce5673122" containerName="mariadb-client-6-default" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.992817 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 24 12:24:32 crc kubenswrapper[4752]: I1124 12:24:32.997621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.015435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbpjh\" (UniqueName: \"kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh\") pod \"mariadb-client-7-default\" (UID: \"b0ceb1e4-23c0-47a6-b40d-629644ab5063\") " pod="openstack/mariadb-client-7-default" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.015520 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24mwf\" (UniqueName: \"kubernetes.io/projected/9c9b700e-6515-4ad2-a0da-6edce5673122-kube-api-access-24mwf\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.116058 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbpjh\" (UniqueName: \"kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh\") pod \"mariadb-client-7-default\" (UID: \"b0ceb1e4-23c0-47a6-b40d-629644ab5063\") " pod="openstack/mariadb-client-7-default" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.133656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbpjh\" (UniqueName: \"kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh\") pod \"mariadb-client-7-default\" (UID: \"b0ceb1e4-23c0-47a6-b40d-629644ab5063\") " pod="openstack/mariadb-client-7-default" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.346102 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.496913 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f1bb06a3aef8f1ebd0ca9494afabafc976e79301b4e49aa6404ada40d58224" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.496957 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 24 12:24:33 crc kubenswrapper[4752]: I1124 12:24:33.835936 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 24 12:24:34 crc kubenswrapper[4752]: I1124 12:24:34.505705 4752 generic.go:334] "Generic (PLEG): container finished" podID="b0ceb1e4-23c0-47a6-b40d-629644ab5063" containerID="407e994f8b0be39e2be3ef7de51b6edb928df16b26b4146717afff104b33eef4" exitCode=0 Nov 24 12:24:34 crc kubenswrapper[4752]: I1124 12:24:34.505809 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b0ceb1e4-23c0-47a6-b40d-629644ab5063","Type":"ContainerDied","Data":"407e994f8b0be39e2be3ef7de51b6edb928df16b26b4146717afff104b33eef4"} Nov 24 12:24:34 crc kubenswrapper[4752]: I1124 12:24:34.506108 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"b0ceb1e4-23c0-47a6-b40d-629644ab5063","Type":"ContainerStarted","Data":"716c1bcab98ae683dc8b600196730b8ee3c7cbeb5a7e6a73c5500278dec4bb93"} Nov 24 12:24:34 crc kubenswrapper[4752]: I1124 12:24:34.740905 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c9b700e-6515-4ad2-a0da-6edce5673122" path="/var/lib/kubelet/pods/9c9b700e-6515-4ad2-a0da-6edce5673122/volumes" Nov 24 12:24:35 crc kubenswrapper[4752]: I1124 12:24:35.888981 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 24 12:24:35 crc kubenswrapper[4752]: I1124 12:24:35.906907 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_b0ceb1e4-23c0-47a6-b40d-629644ab5063/mariadb-client-7-default/0.log" Nov 24 12:24:35 crc kubenswrapper[4752]: I1124 12:24:35.936324 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 24 12:24:35 crc kubenswrapper[4752]: I1124 12:24:35.941293 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.056385 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbpjh\" (UniqueName: \"kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh\") pod \"b0ceb1e4-23c0-47a6-b40d-629644ab5063\" (UID: \"b0ceb1e4-23c0-47a6-b40d-629644ab5063\") " Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.062196 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh" (OuterVolumeSpecName: "kube-api-access-tbpjh") pod "b0ceb1e4-23c0-47a6-b40d-629644ab5063" (UID: "b0ceb1e4-23c0-47a6-b40d-629644ab5063"). InnerVolumeSpecName "kube-api-access-tbpjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.086346 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Nov 24 12:24:36 crc kubenswrapper[4752]: E1124 12:24:36.087138 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ceb1e4-23c0-47a6-b40d-629644ab5063" containerName="mariadb-client-7-default" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.087163 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ceb1e4-23c0-47a6-b40d-629644ab5063" containerName="mariadb-client-7-default" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.087461 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ceb1e4-23c0-47a6-b40d-629644ab5063" containerName="mariadb-client-7-default" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.088329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.102826 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.158128 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjp5v\" (UniqueName: \"kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v\") pod \"mariadb-client-2\" (UID: \"80c6e512-72c1-4359-9548-6eb6518e3092\") " pod="openstack/mariadb-client-2" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.158284 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbpjh\" (UniqueName: \"kubernetes.io/projected/b0ceb1e4-23c0-47a6-b40d-629644ab5063-kube-api-access-tbpjh\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.259116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjp5v\" (UniqueName: \"kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v\") pod \"mariadb-client-2\" (UID: \"80c6e512-72c1-4359-9548-6eb6518e3092\") " pod="openstack/mariadb-client-2" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.277448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjp5v\" (UniqueName: \"kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v\") pod \"mariadb-client-2\" (UID: \"80c6e512-72c1-4359-9548-6eb6518e3092\") " pod="openstack/mariadb-client-2" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.422185 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.538639 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="716c1bcab98ae683dc8b600196730b8ee3c7cbeb5a7e6a73c5500278dec4bb93" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.538705 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 24 12:24:36 crc kubenswrapper[4752]: I1124 12:24:36.735821 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ceb1e4-23c0-47a6-b40d-629644ab5063" path="/var/lib/kubelet/pods/b0ceb1e4-23c0-47a6-b40d-629644ab5063/volumes" Nov 24 12:24:37 crc kubenswrapper[4752]: I1124 12:24:36.926571 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 24 12:24:37 crc kubenswrapper[4752]: W1124 12:24:36.938028 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80c6e512_72c1_4359_9548_6eb6518e3092.slice/crio-1cf9552e1b92d09abb91c728e5c3252f712df475a4e69e5a001beb9313614ff1 WatchSource:0}: Error finding container 1cf9552e1b92d09abb91c728e5c3252f712df475a4e69e5a001beb9313614ff1: Status 404 returned error can't find the container with id 1cf9552e1b92d09abb91c728e5c3252f712df475a4e69e5a001beb9313614ff1 Nov 24 12:24:37 crc kubenswrapper[4752]: I1124 12:24:37.555199 4752 generic.go:334] "Generic (PLEG): container finished" podID="80c6e512-72c1-4359-9548-6eb6518e3092" containerID="2efd9b4d72b138d56faff435a5fb0f29bfc7b9cfb1549fcd47b983b75bf4c663" exitCode=0 Nov 24 12:24:37 crc kubenswrapper[4752]: I1124 12:24:37.555298 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"80c6e512-72c1-4359-9548-6eb6518e3092","Type":"ContainerDied","Data":"2efd9b4d72b138d56faff435a5fb0f29bfc7b9cfb1549fcd47b983b75bf4c663"} Nov 24 12:24:37 crc kubenswrapper[4752]: I1124 12:24:37.555819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"80c6e512-72c1-4359-9548-6eb6518e3092","Type":"ContainerStarted","Data":"1cf9552e1b92d09abb91c728e5c3252f712df475a4e69e5a001beb9313614ff1"} Nov 24 12:24:38 crc kubenswrapper[4752]: I1124 12:24:38.973236 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 24 12:24:38 crc kubenswrapper[4752]: I1124 12:24:38.990737 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_80c6e512-72c1-4359-9548-6eb6518e3092/mariadb-client-2/0.log" Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.021879 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.028522 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.100285 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjp5v\" (UniqueName: \"kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v\") pod \"80c6e512-72c1-4359-9548-6eb6518e3092\" (UID: \"80c6e512-72c1-4359-9548-6eb6518e3092\") " Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.105377 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v" (OuterVolumeSpecName: "kube-api-access-hjp5v") pod "80c6e512-72c1-4359-9548-6eb6518e3092" (UID: "80c6e512-72c1-4359-9548-6eb6518e3092"). InnerVolumeSpecName "kube-api-access-hjp5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.202650 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjp5v\" (UniqueName: \"kubernetes.io/projected/80c6e512-72c1-4359-9548-6eb6518e3092-kube-api-access-hjp5v\") on node \"crc\" DevicePath \"\"" Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.576411 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf9552e1b92d09abb91c728e5c3252f712df475a4e69e5a001beb9313614ff1" Nov 24 12:24:39 crc kubenswrapper[4752]: I1124 12:24:39.576509 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 24 12:24:40 crc kubenswrapper[4752]: I1124 12:24:40.745485 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c6e512-72c1-4359-9548-6eb6518e3092" path="/var/lib/kubelet/pods/80c6e512-72c1-4359-9548-6eb6518e3092/volumes" Nov 24 12:24:45 crc kubenswrapper[4752]: I1124 12:24:45.468443 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:24:45 crc kubenswrapper[4752]: I1124 12:24:45.469258 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:25:08 crc kubenswrapper[4752]: I1124 12:25:08.911691 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:08 crc kubenswrapper[4752]: E1124 12:25:08.915335 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c6e512-72c1-4359-9548-6eb6518e3092" containerName="mariadb-client-2" Nov 24 12:25:08 crc kubenswrapper[4752]: I1124 12:25:08.915363 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c6e512-72c1-4359-9548-6eb6518e3092" containerName="mariadb-client-2" Nov 24 12:25:08 crc kubenswrapper[4752]: I1124 12:25:08.915643 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c6e512-72c1-4359-9548-6eb6518e3092" containerName="mariadb-client-2" Nov 24 12:25:08 crc kubenswrapper[4752]: I1124 12:25:08.918501 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:08 crc kubenswrapper[4752]: I1124 12:25:08.922423 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.092735 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.092876 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.092943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nbf\" (UniqueName: \"kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.194600 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.194705 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.194779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nbf\" (UniqueName: \"kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.195266 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.195389 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.231056 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l5nbf\" (UniqueName: \"kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf\") pod \"certified-operators-gsr8n\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.240471 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:09 crc kubenswrapper[4752]: I1124 12:25:09.850438 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:10 crc kubenswrapper[4752]: I1124 12:25:10.840610 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerID="e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17" exitCode=0 Nov 24 12:25:10 crc kubenswrapper[4752]: I1124 12:25:10.840718 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerDied","Data":"e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17"} Nov 24 12:25:10 crc kubenswrapper[4752]: I1124 12:25:10.841318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerStarted","Data":"0319791da50aceef997277d0317ddb472cf097cb553b2807b37378214a67fb2c"} Nov 24 12:25:10 crc kubenswrapper[4752]: I1124 12:25:10.844714 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:25:11 crc kubenswrapper[4752]: I1124 12:25:11.852036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerStarted","Data":"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899"} Nov 24 12:25:12 crc kubenswrapper[4752]: I1124 12:25:12.873281 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerID="b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899" exitCode=0 Nov 24 12:25:12 crc kubenswrapper[4752]: I1124 12:25:12.873377 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerDied","Data":"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899"} Nov 24 12:25:13 crc kubenswrapper[4752]: I1124 12:25:13.882894 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerStarted","Data":"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b"} Nov 24 12:25:15 crc kubenswrapper[4752]: I1124 12:25:15.468397 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:25:15 crc kubenswrapper[4752]: I1124 12:25:15.468478 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:25:19 crc kubenswrapper[4752]: I1124 12:25:19.241190 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:19 crc kubenswrapper[4752]: I1124 12:25:19.241621 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:19 crc kubenswrapper[4752]: I1124 12:25:19.280948 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:19 crc kubenswrapper[4752]: I1124 12:25:19.303315 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gsr8n" podStartSLOduration=8.876552428 podStartE2EDuration="11.303295794s" podCreationTimestamp="2025-11-24 12:25:08 +0000 UTC" firstStartedPulling="2025-11-24 12:25:10.844321057 +0000 UTC m=+4716.829141386" lastFinishedPulling="2025-11-24 12:25:13.271064463 +0000 UTC m=+4719.255884752" observedRunningTime="2025-11-24 12:25:13.904521466 +0000 UTC m=+4719.889341755" watchObservedRunningTime="2025-11-24 12:25:19.303295794 +0000 UTC m=+4725.288116073" Nov 24 12:25:19 crc kubenswrapper[4752]: I1124 12:25:19.966307 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:20 crc kubenswrapper[4752]: I1124 12:25:20.022360 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:21 crc kubenswrapper[4752]: I1124 12:25:21.935851 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gsr8n" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="registry-server" containerID="cri-o://519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b" gracePeriod=2 Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.355828 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.497373 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5nbf\" (UniqueName: \"kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf\") pod \"3d7c2582-7a35-44aa-b3af-429319f3931a\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.497505 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities\") pod \"3d7c2582-7a35-44aa-b3af-429319f3931a\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.497588 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content\") pod \"3d7c2582-7a35-44aa-b3af-429319f3931a\" (UID: \"3d7c2582-7a35-44aa-b3af-429319f3931a\") " Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.498866 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities" (OuterVolumeSpecName: "utilities") pod "3d7c2582-7a35-44aa-b3af-429319f3931a" (UID: "3d7c2582-7a35-44aa-b3af-429319f3931a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.507245 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf" (OuterVolumeSpecName: "kube-api-access-l5nbf") pod "3d7c2582-7a35-44aa-b3af-429319f3931a" (UID: "3d7c2582-7a35-44aa-b3af-429319f3931a"). InnerVolumeSpecName "kube-api-access-l5nbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.560891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d7c2582-7a35-44aa-b3af-429319f3931a" (UID: "3d7c2582-7a35-44aa-b3af-429319f3931a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.599989 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5nbf\" (UniqueName: \"kubernetes.io/projected/3d7c2582-7a35-44aa-b3af-429319f3931a-kube-api-access-l5nbf\") on node \"crc\" DevicePath \"\"" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.600040 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.600059 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d7c2582-7a35-44aa-b3af-429319f3931a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.944667 4752 generic.go:334] "Generic (PLEG): container finished" podID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerID="519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b" exitCode=0 Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.944719 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerDied","Data":"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b"} Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.944731 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsr8n" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.944761 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsr8n" event={"ID":"3d7c2582-7a35-44aa-b3af-429319f3931a","Type":"ContainerDied","Data":"0319791da50aceef997277d0317ddb472cf097cb553b2807b37378214a67fb2c"} Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.944780 4752 scope.go:117] "RemoveContainer" containerID="519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.985711 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.986074 4752 scope.go:117] "RemoveContainer" containerID="b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899" Nov 24 12:25:22 crc kubenswrapper[4752]: I1124 12:25:22.992405 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gsr8n"] Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.006790 4752 scope.go:117] "RemoveContainer" containerID="e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.029589 4752 scope.go:117] "RemoveContainer" containerID="519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b" Nov 24 12:25:23 crc kubenswrapper[4752]: E1124 12:25:23.030083 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b\": container with ID starting with 519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b not found: ID does not exist" containerID="519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.030116 
4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b"} err="failed to get container status \"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b\": rpc error: code = NotFound desc = could not find container \"519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b\": container with ID starting with 519e3ceb295f9b2489833a08796992601902fe793f7555f9761b529b90b8c22b not found: ID does not exist" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.030157 4752 scope.go:117] "RemoveContainer" containerID="b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899" Nov 24 12:25:23 crc kubenswrapper[4752]: E1124 12:25:23.030456 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899\": container with ID starting with b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899 not found: ID does not exist" containerID="b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.030572 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899"} err="failed to get container status \"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899\": rpc error: code = NotFound desc = could not find container \"b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899\": container with ID starting with b388bea21f83d41cf06a99e64b8df1d0f6da54f7e2f7d6e71db7ef8364c06899 not found: ID does not exist" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.030678 4752 scope.go:117] "RemoveContainer" containerID="e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17" Nov 24 12:25:23 crc kubenswrapper[4752]: E1124 12:25:23.031132 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17\": container with ID starting with e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17 not found: ID does not exist" containerID="e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17" Nov 24 12:25:23 crc kubenswrapper[4752]: I1124 12:25:23.031226 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17"} err="failed to get container status \"e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17\": rpc error: code = NotFound desc = could not find container \"e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17\": container with ID starting with e3407a83a1ef305d5e989d47e4b9698fc5670b7a5a52e2672abe7ba9ea573e17 not found: ID does not exist" Nov 24 12:25:24 crc kubenswrapper[4752]: I1124 12:25:24.737447 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" path="/var/lib/kubelet/pods/3d7c2582-7a35-44aa-b3af-429319f3931a/volumes" Nov 24 12:25:41 crc kubenswrapper[4752]: I1124 12:25:41.934431 4752 scope.go:117] "RemoveContainer" containerID="9984e35d2a8c63bf81e61d3edba860ff9e2a571f4d705b5aa4b3a57ebbd59eed" Nov 24 12:25:45 crc kubenswrapper[4752]: I1124 12:25:45.468917 4752 patch_prober.go:28] interesting 
pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:25:45 crc kubenswrapper[4752]: I1124 12:25:45.469523 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:25:45 crc kubenswrapper[4752]: I1124 12:25:45.469570 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:25:45 crc kubenswrapper[4752]: I1124 12:25:45.470245 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:25:45 crc kubenswrapper[4752]: I1124 12:25:45.470315 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3" gracePeriod=600 Nov 24 12:25:46 crc kubenswrapper[4752]: I1124 12:25:46.141089 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3" exitCode=0 Nov 24 12:25:46 crc kubenswrapper[4752]: I1124 12:25:46.141451 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3"} Nov 24 12:25:46 crc kubenswrapper[4752]: I1124 12:25:46.141484 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a"} Nov 24 12:25:46 crc kubenswrapper[4752]: I1124 12:25:46.141505 4752 scope.go:117] "RemoveContainer" containerID="a07fb79ffa4c02fa1ac7fbfe36d7a8beb4399a115479826734e5d12ec7283331" Nov 24 12:27:45 crc kubenswrapper[4752]: I1124 12:27:45.468990 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:27:45 crc kubenswrapper[4752]: I1124 12:27:45.469651 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:28:15 
crc kubenswrapper[4752]: I1124 12:28:15.468505 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:28:15 crc kubenswrapper[4752]: I1124 12:28:15.469286 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.148115 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:20 crc kubenswrapper[4752]: E1124 12:28:20.149444 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="extract-content" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.149462 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="extract-content" Nov 24 12:28:20 crc kubenswrapper[4752]: E1124 12:28:20.149476 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="extract-utilities" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.149484 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="extract-utilities" Nov 24 12:28:20 crc kubenswrapper[4752]: E1124 12:28:20.149510 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="registry-server" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.149518 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="registry-server" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.149704 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7c2582-7a35-44aa-b3af-429319f3931a" containerName="registry-server" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.151101 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.213889 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.259040 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5bx\" (UniqueName: \"kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.259188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.259308 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.361003 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5bx\" (UniqueName: \"kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.361086 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.361172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.361721 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.362394 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.382617 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4d5bx\" (UniqueName: \"kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx\") pod \"redhat-marketplace-zznxs\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.493583 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:20 crc kubenswrapper[4752]: I1124 12:28:20.924932 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:21 crc kubenswrapper[4752]: I1124 12:28:21.531880 4752 generic.go:334] "Generic (PLEG): container finished" podID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerID="60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916" exitCode=0 Nov 24 12:28:21 crc kubenswrapper[4752]: I1124 12:28:21.531981 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerDied","Data":"60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916"} Nov 24 12:28:21 crc kubenswrapper[4752]: I1124 12:28:21.532265 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerStarted","Data":"22bd06919af17065949dc0031a668caf45df00b9db66a132eb67740b9dfa135e"} Nov 24 12:28:23 crc kubenswrapper[4752]: I1124 12:28:23.550953 4752 generic.go:334] "Generic (PLEG): container finished" podID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerID="44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713" exitCode=0 Nov 24 12:28:23 crc kubenswrapper[4752]: I1124 12:28:23.551052 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerDied","Data":"44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713"} Nov 24 12:28:24 crc kubenswrapper[4752]: I1124 12:28:24.562107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerStarted","Data":"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539"} Nov 24 12:28:24 crc kubenswrapper[4752]: I1124 12:28:24.584386 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zznxs" podStartSLOduration=2.163385646 podStartE2EDuration="4.584372368s" podCreationTimestamp="2025-11-24 12:28:20 +0000 UTC" firstStartedPulling="2025-11-24 12:28:21.535298177 +0000 UTC m=+4907.520118486" lastFinishedPulling="2025-11-24 12:28:23.956284929 +0000 UTC m=+4909.941105208" observedRunningTime="2025-11-24 12:28:24.582126553 +0000 UTC m=+4910.566946842" watchObservedRunningTime="2025-11-24 12:28:24.584372368 +0000 UTC m=+4910.569192657" Nov 24 12:28:30 crc kubenswrapper[4752]: I1124 12:28:30.494461 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:30 crc kubenswrapper[4752]: I1124 12:28:30.495517 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:30 crc kubenswrapper[4752]: I1124 12:28:30.563790 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:30 crc kubenswrapper[4752]: I1124 12:28:30.668355 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:30 crc kubenswrapper[4752]: I1124 12:28:30.799278 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:32 crc kubenswrapper[4752]: I1124 12:28:32.635184 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zznxs" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="registry-server" containerID="cri-o://6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539" gracePeriod=2 Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.082619 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.279771 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content\") pod \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.280230 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities\") pod \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.281715 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities" (OuterVolumeSpecName: "utilities") pod "41912f55-f0b3-4be6-bee2-4dcf764af9f0" (UID: "41912f55-f0b3-4be6-bee2-4dcf764af9f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.282040 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d5bx\" (UniqueName: \"kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx\") pod \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\" (UID: \"41912f55-f0b3-4be6-bee2-4dcf764af9f0\") " Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.284238 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.291855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx" (OuterVolumeSpecName: "kube-api-access-4d5bx") pod "41912f55-f0b3-4be6-bee2-4dcf764af9f0" (UID: "41912f55-f0b3-4be6-bee2-4dcf764af9f0"). InnerVolumeSpecName "kube-api-access-4d5bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.306794 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41912f55-f0b3-4be6-bee2-4dcf764af9f0" (UID: "41912f55-f0b3-4be6-bee2-4dcf764af9f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.386090 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d5bx\" (UniqueName: \"kubernetes.io/projected/41912f55-f0b3-4be6-bee2-4dcf764af9f0-kube-api-access-4d5bx\") on node \"crc\" DevicePath \"\"" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.386145 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41912f55-f0b3-4be6-bee2-4dcf764af9f0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.651374 4752 generic.go:334] "Generic (PLEG): container finished" podID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerID="6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539" exitCode=0 Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.651432 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerDied","Data":"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539"} Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.651477 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zznxs" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.651529 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zznxs" event={"ID":"41912f55-f0b3-4be6-bee2-4dcf764af9f0","Type":"ContainerDied","Data":"22bd06919af17065949dc0031a668caf45df00b9db66a132eb67740b9dfa135e"} Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.651567 4752 scope.go:117] "RemoveContainer" containerID="6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.682459 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.684105 4752 scope.go:117] "RemoveContainer" containerID="44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.688830 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zznxs"] Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.710911 4752 scope.go:117] "RemoveContainer" containerID="60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.733090 4752 scope.go:117] "RemoveContainer" containerID="6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539" Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.733676 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539\": container with ID starting with 
6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539 not found: ID does not exist" containerID="6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.733738 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539"} err="failed to get container status \"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539\": rpc error: code = NotFound desc = could not find container \"6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539\": container with ID starting with 6fa5a01fb53a354399c89c0c6b4f7d41157bc3c0da7c307b8a14667e4c04d539 not found: ID does not exist" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.733795 4752 scope.go:117] "RemoveContainer" containerID="44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713" Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.734304 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713\": container with ID starting with 44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713 not found: ID does not exist" containerID="44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.734337 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713"} err="failed to get container status \"44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713\": rpc error: code = NotFound desc = could not find container \"44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713\": container with ID starting with 44cab84960e74d1fd96b27551bcc7e3545420f91d26dc6bc6add7f5ef32e6713 not found: ID does not exist" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.734353 4752 scope.go:117] "RemoveContainer" containerID="60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916" Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.734629 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916\": container with ID starting with 60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916 not found: ID does not exist" containerID="60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.734654 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916"} err="failed to get container status \"60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916\": rpc error: code = NotFound desc = could not find container \"60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916\": container with ID starting with 60d5622282288d58116a242d5b4c8b36287a1900d4bcdfb6166654898a710916 not found: ID does not exist" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.941536 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.941966 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="extract-utilities" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.941987 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="extract-utilities" Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.942002 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="extract-content" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.942008 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="extract-content" Nov 24 12:28:33 crc kubenswrapper[4752]: E1124 12:28:33.942034 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="registry-server" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.942040 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="registry-server" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.942201 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" containerName="registry-server" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.942727 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.947183 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dhdgz" Nov 24 12:28:33 crc kubenswrapper[4752]: I1124 12:28:33.957992 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.097203 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f8mb\" (UniqueName: \"kubernetes.io/projected/a69e61b5-1af7-4e15-90d6-a255a4e8e897-kube-api-access-6f8mb\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.097281 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.199655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f8mb\" (UniqueName: \"kubernetes.io/projected/a69e61b5-1af7-4e15-90d6-a255a4e8e897-kube-api-access-6f8mb\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.200184 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.206012 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.206207 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/eaf5538f371803c3a4783fa6946db7de2c26892cfb078967061772bd35e10864/globalmount\"" pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.229656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f8mb\" (UniqueName: \"kubernetes.io/projected/a69e61b5-1af7-4e15-90d6-a255a4e8e897-kube-api-access-6f8mb\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.257633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1dc286de-9de6-4b52-8dc8-4349df67d983\") pod \"mariadb-copy-data\" (UID: \"a69e61b5-1af7-4e15-90d6-a255a4e8e897\") " pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.268313 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 24 12:28:34 crc kubenswrapper[4752]: W1124 12:28:34.600681 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda69e61b5_1af7_4e15_90d6_a255a4e8e897.slice/crio-94efc8e7a1e76f81cc0367c26fad17d001cf5f8c1c48174c1c8de929f14a0d96 WatchSource:0}: Error finding container 94efc8e7a1e76f81cc0367c26fad17d001cf5f8c1c48174c1c8de929f14a0d96: Status 404 returned error can't find the container with id 94efc8e7a1e76f81cc0367c26fad17d001cf5f8c1c48174c1c8de929f14a0d96 Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.621331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.679102 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a69e61b5-1af7-4e15-90d6-a255a4e8e897","Type":"ContainerStarted","Data":"94efc8e7a1e76f81cc0367c26fad17d001cf5f8c1c48174c1c8de929f14a0d96"} Nov 24 12:28:34 crc kubenswrapper[4752]: I1124 12:28:34.742571 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41912f55-f0b3-4be6-bee2-4dcf764af9f0" path="/var/lib/kubelet/pods/41912f55-f0b3-4be6-bee2-4dcf764af9f0/volumes" Nov 24 12:28:35 crc kubenswrapper[4752]: I1124 12:28:35.689470 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a69e61b5-1af7-4e15-90d6-a255a4e8e897","Type":"ContainerStarted","Data":"d7ca833f32a8c7a6e404c1b34bba311680456e9cd82e2a1f35e3e167f2a430b3"} Nov 24 12:28:35 crc kubenswrapper[4752]: I1124 12:28:35.715091 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.715070302 podStartE2EDuration="3.715070302s" podCreationTimestamp="2025-11-24 12:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:28:35.711070608 +0000 
UTC m=+4921.695890897" watchObservedRunningTime="2025-11-24 12:28:35.715070302 +0000 UTC m=+4921.699890591" Nov 24 12:28:38 crc kubenswrapper[4752]: I1124 12:28:38.691836 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:38 crc kubenswrapper[4752]: I1124 12:28:38.694770 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:38 crc kubenswrapper[4752]: I1124 12:28:38.710520 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:38 crc kubenswrapper[4752]: I1124 12:28:38.874977 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6psmc\" (UniqueName: \"kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc\") pod \"mariadb-client\" (UID: \"ebb3ade7-8d0d-4bc0-8862-93be2c883acd\") " pod="openstack/mariadb-client" Nov 24 12:28:38 crc kubenswrapper[4752]: I1124 12:28:38.976778 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6psmc\" (UniqueName: \"kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc\") pod \"mariadb-client\" (UID: \"ebb3ade7-8d0d-4bc0-8862-93be2c883acd\") " pod="openstack/mariadb-client" Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.002513 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6psmc\" (UniqueName: \"kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc\") pod \"mariadb-client\" (UID: \"ebb3ade7-8d0d-4bc0-8862-93be2c883acd\") " pod="openstack/mariadb-client" Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.032623 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.481249 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:39 crc kubenswrapper[4752]: W1124 12:28:39.489331 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebb3ade7_8d0d_4bc0_8862_93be2c883acd.slice/crio-4b652fa9491695981fca7ca802ccad68f29ae40d434174bc301263dc9ac9e325 WatchSource:0}: Error finding container 4b652fa9491695981fca7ca802ccad68f29ae40d434174bc301263dc9ac9e325: Status 404 returned error can't find the container with id 4b652fa9491695981fca7ca802ccad68f29ae40d434174bc301263dc9ac9e325 Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.735695 4752 generic.go:334] "Generic (PLEG): container finished" podID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" containerID="c8dd2495f532a5a91797f6fcf7f93ffba663cf328ca6507aadf35d76895744b6" exitCode=0 Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.735805 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ebb3ade7-8d0d-4bc0-8862-93be2c883acd","Type":"ContainerDied","Data":"c8dd2495f532a5a91797f6fcf7f93ffba663cf328ca6507aadf35d76895744b6"} Nov 24 12:28:39 crc kubenswrapper[4752]: I1124 12:28:39.736202 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ebb3ade7-8d0d-4bc0-8862-93be2c883acd","Type":"ContainerStarted","Data":"4b652fa9491695981fca7ca802ccad68f29ae40d434174bc301263dc9ac9e325"} Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.147416 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.168905 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ebb3ade7-8d0d-4bc0-8862-93be2c883acd/mariadb-client/0.log" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.194337 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.199467 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.313938 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6psmc\" (UniqueName: \"kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc\") pod \"ebb3ade7-8d0d-4bc0-8862-93be2c883acd\" (UID: \"ebb3ade7-8d0d-4bc0-8862-93be2c883acd\") " Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.320330 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc" (OuterVolumeSpecName: "kube-api-access-6psmc") pod "ebb3ade7-8d0d-4bc0-8862-93be2c883acd" (UID: "ebb3ade7-8d0d-4bc0-8862-93be2c883acd"). InnerVolumeSpecName "kube-api-access-6psmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.353072 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:41 crc kubenswrapper[4752]: E1124 12:28:41.353439 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" containerName="mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.353455 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" containerName="mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.353640 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" containerName="mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.354211 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.364447 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.415996 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6psmc\" (UniqueName: \"kubernetes.io/projected/ebb3ade7-8d0d-4bc0-8862-93be2c883acd-kube-api-access-6psmc\") on node \"crc\" DevicePath \"\"" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.517593 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxct\" (UniqueName: \"kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct\") pod \"mariadb-client\" (UID: \"807c9f54-3c93-4228-be14-89483bea987c\") " pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.619312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxct\" (UniqueName: \"kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct\") pod \"mariadb-client\" (UID: \"807c9f54-3c93-4228-be14-89483bea987c\") " pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.641717 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxct\" (UniqueName: \"kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct\") pod \"mariadb-client\" (UID: \"807c9f54-3c93-4228-be14-89483bea987c\") " pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.702479 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.753686 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b652fa9491695981fca7ca802ccad68f29ae40d434174bc301263dc9ac9e325" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.753822 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:41 crc kubenswrapper[4752]: I1124 12:28:41.775107 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" podUID="807c9f54-3c93-4228-be14-89483bea987c" Nov 24 12:28:42 crc kubenswrapper[4752]: I1124 12:28:42.119837 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:42 crc kubenswrapper[4752]: W1124 12:28:42.124046 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807c9f54_3c93_4228_be14_89483bea987c.slice/crio-947c21ec0b178d725f8397d98bf573f11ab7b4538ded4eaa1aa74c628fa164de WatchSource:0}: Error finding container 947c21ec0b178d725f8397d98bf573f11ab7b4538ded4eaa1aa74c628fa164de: Status 404 returned error can't find the container with id 947c21ec0b178d725f8397d98bf573f11ab7b4538ded4eaa1aa74c628fa164de Nov 24 12:28:42 crc kubenswrapper[4752]: I1124 12:28:42.743820 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb3ade7-8d0d-4bc0-8862-93be2c883acd" path="/var/lib/kubelet/pods/ebb3ade7-8d0d-4bc0-8862-93be2c883acd/volumes" Nov 24 12:28:42 crc kubenswrapper[4752]: I1124 12:28:42.768651 4752 generic.go:334] "Generic (PLEG): container finished" podID="807c9f54-3c93-4228-be14-89483bea987c" containerID="33591f753780a7a066d188c51926f4a7f858777e9991ed358b26a316ea40b50b" exitCode=0 Nov 24 12:28:42 crc kubenswrapper[4752]: I1124 12:28:42.768788 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"807c9f54-3c93-4228-be14-89483bea987c","Type":"ContainerDied","Data":"33591f753780a7a066d188c51926f4a7f858777e9991ed358b26a316ea40b50b"} Nov 24 12:28:42 crc kubenswrapper[4752]: I1124 12:28:42.768841 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"807c9f54-3c93-4228-be14-89483bea987c","Type":"ContainerStarted","Data":"947c21ec0b178d725f8397d98bf573f11ab7b4538ded4eaa1aa74c628fa164de"} Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.136921 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.157620 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_807c9f54-3c93-4228-be14-89483bea987c/mariadb-client/0.log" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.194684 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.204200 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.264574 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdxct\" (UniqueName: \"kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct\") pod \"807c9f54-3c93-4228-be14-89483bea987c\" (UID: \"807c9f54-3c93-4228-be14-89483bea987c\") " Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.270224 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct" (OuterVolumeSpecName: "kube-api-access-mdxct") pod "807c9f54-3c93-4228-be14-89483bea987c" (UID: "807c9f54-3c93-4228-be14-89483bea987c"). 
InnerVolumeSpecName "kube-api-access-mdxct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.367066 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdxct\" (UniqueName: \"kubernetes.io/projected/807c9f54-3c93-4228-be14-89483bea987c-kube-api-access-mdxct\") on node \"crc\" DevicePath \"\"" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.743827 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807c9f54-3c93-4228-be14-89483bea987c" path="/var/lib/kubelet/pods/807c9f54-3c93-4228-be14-89483bea987c/volumes" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.788389 4752 scope.go:117] "RemoveContainer" containerID="33591f753780a7a066d188c51926f4a7f858777e9991ed358b26a316ea40b50b" Nov 24 12:28:44 crc kubenswrapper[4752]: I1124 12:28:44.788436 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 24 12:28:44 crc kubenswrapper[4752]: E1124 12:28:44.905368 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807c9f54_3c93_4228_be14_89483bea987c.slice\": RecentStats: unable to find data in memory cache]" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.469258 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.469343 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.469408 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.470447 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.470658 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" gracePeriod=600 Nov 24 12:28:45 crc kubenswrapper[4752]: E1124 12:28:45.601642 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.811986 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" exitCode=0 Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.812069 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a"} Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.812112 4752 scope.go:117] "RemoveContainer" containerID="a761cdb5d16e454a3bb6fbbc0a1a780f17a2f259a2694adda1f6d7d5f0bbc7a3" Nov 24 12:28:45 crc kubenswrapper[4752]: I1124 12:28:45.812718 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:28:45 crc kubenswrapper[4752]: E1124 12:28:45.813111 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:28:56 crc kubenswrapper[4752]: I1124 12:28:56.728857 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:28:56 crc kubenswrapper[4752]: E1124 12:28:56.730451 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:29:09 crc kubenswrapper[4752]: I1124 12:29:09.728097 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:29:09 crc kubenswrapper[4752]: E1124 12:29:09.729030 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:29:14 crc kubenswrapper[4752]: E1124 12:29:14.764263 4752 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.145:59168->38.102.83.145:38429: write tcp 38.102.83.145:59168->38.102.83.145:38429: write: connection reset by peer Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.762454 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 12:29:18 crc kubenswrapper[4752]: E1124 12:29:18.763965 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807c9f54-3c93-4228-be14-89483bea987c" containerName="mariadb-client" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.763991 4752 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="807c9f54-3c93-4228-be14-89483bea987c" containerName="mariadb-client" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.764282 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="807c9f54-3c93-4228-be14-89483bea987c" containerName="mariadb-client" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.777024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.779813 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.779902 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.780490 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.781269 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.781590 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4xhc2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.788280 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.789987 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.796546 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.807547 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.814522 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.974771 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.974891 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.974946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc0a8c-07d1-4962-8686-22042a0db911-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-config\") pod 
\"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63dc0a8c-07d1-4962-8686-22042a0db911-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975107 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82c20dce-5921-46ea-aaa2-b62e24923c1c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975182 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-config\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975273 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1d8b13-b2a7-483a-b1ad-be75868386e7-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-config\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975388 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975471 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975534 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g962f\" (UniqueName: \"kubernetes.io/projected/1d1d8b13-b2a7-483a-b1ad-be75868386e7-kube-api-access-g962f\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975577 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8kz\" (UniqueName: \"kubernetes.io/projected/63dc0a8c-07d1-4962-8686-22042a0db911-kube-api-access-8m8kz\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:18 crc 
kubenswrapper[4752]: I1124 12:29:18.975813 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1d8b13-b2a7-483a-b1ad-be75868386e7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975888 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c20dce-5921-46ea-aaa2-b62e24923c1c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975950 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.975996 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.976024 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6wd\" (UniqueName: \"kubernetes.io/projected/82c20dce-5921-46ea-aaa2-b62e24923c1c-kube-api-access-sz6wd\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.979699 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.982077 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.992316 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rtdjf" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.992505 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.992914 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 24 12:29:18 crc kubenswrapper[4752]: I1124 12:29:18.997638 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:18.999983 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.023147 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.023207 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.024633 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.024733 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.038178 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077515 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077606 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc0a8c-07d1-4962-8686-22042a0db911-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077645 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-config\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63dc0a8c-07d1-4962-8686-22042a0db911-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077702 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82c20dce-5921-46ea-aaa2-b62e24923c1c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-config\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077948 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1d8b13-b2a7-483a-b1ad-be75868386e7-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077973 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-config\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.077994 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078024 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g962f\" (UniqueName: \"kubernetes.io/projected/1d1d8b13-b2a7-483a-b1ad-be75868386e7-kube-api-access-g962f\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8kz\" (UniqueName: \"kubernetes.io/projected/63dc0a8c-07d1-4962-8686-22042a0db911-kube-api-access-8m8kz\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078093 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1d8b13-b2a7-483a-b1ad-be75868386e7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078120 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c20dce-5921-46ea-aaa2-b62e24923c1c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078156 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078220 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz6wd\" (UniqueName: \"kubernetes.io/projected/82c20dce-5921-46ea-aaa2-b62e24923c1c-kube-api-access-sz6wd\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " 
pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.078519 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1d8b13-b2a7-483a-b1ad-be75868386e7-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.079342 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.080565 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-config\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.080843 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63dc0a8c-07d1-4962-8686-22042a0db911-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.081077 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1d8b13-b2a7-483a-b1ad-be75868386e7-config\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.081814 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-config\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.084167 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.084216 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.084213 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ed7278f566df2c9e11c8fdb228cde003947b8091fc50291b6935daf6c9edf588/globalmount\"" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.084239 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f7136cfef448e567906d52595aa4d6a11bdadb1af98a5f23422c8232be7fbd9c/globalmount\"" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.087424 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dc0a8c-07d1-4962-8686-22042a0db911-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.088005 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82c20dce-5921-46ea-aaa2-b62e24923c1c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.089188 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.089239 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b709880ca698021ba3452d258e2fa51d3f7a64a011fadbf7a603111dbf59b957/globalmount\"" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.089934 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63dc0a8c-07d1-4962-8686-22042a0db911-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.092057 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82c20dce-5921-46ea-aaa2-b62e24923c1c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.099603 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1d8b13-b2a7-483a-b1ad-be75868386e7-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.102404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6wd\" (UniqueName: \"kubernetes.io/projected/82c20dce-5921-46ea-aaa2-b62e24923c1c-kube-api-access-sz6wd\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.108537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c20dce-5921-46ea-aaa2-b62e24923c1c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.111964 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8kz\" (UniqueName: \"kubernetes.io/projected/63dc0a8c-07d1-4962-8686-22042a0db911-kube-api-access-8m8kz\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.112985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g962f\" (UniqueName: \"kubernetes.io/projected/1d1d8b13-b2a7-483a-b1ad-be75868386e7-kube-api-access-g962f\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.139685 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8a8cba89-a46b-44a5-932c-6cf6ba1fb43c\") pod \"ovsdbserver-sb-0\" (UID: \"82c20dce-5921-46ea-aaa2-b62e24923c1c\") " pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: 
I1124 12:29:19.140178 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3088af86-19e0-4bb1-96a9-720e87b833e0\") pod \"ovsdbserver-sb-1\" (UID: \"1d1d8b13-b2a7-483a-b1ad-be75868386e7\") " pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.144996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-118a2861-51c2-4116-b2e8-e54130ff75ad\") pod \"ovsdbserver-sb-2\" (UID: \"63dc0a8c-07d1-4962-8686-22042a0db911\") " pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.179953 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25fbb23-a487-4f06-a9c8-66d110ac5903-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b25fbb23-a487-4f06-a9c8-66d110ac5903-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180074 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27731060-c814-4ebe-9d1b-4ac937bdc995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180109 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-config\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180164 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6lz\" (UniqueName: \"kubernetes.io/projected/2a45b572-4f27-4ab2-9eb6-488ebe78895e-kube-api-access-cm6lz\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180250 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-config\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180318 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a45b572-4f27-4ab2-9eb6-488ebe78895e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180358 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27731060-c814-4ebe-9d1b-4ac937bdc995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180398 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180418 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180467 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwmv\" (UniqueName: \"kubernetes.io/projected/27731060-c814-4ebe-9d1b-4ac937bdc995-kube-api-access-cvwmv\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180493 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180535 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cthkx\" (UniqueName: \"kubernetes.io/projected/b25fbb23-a487-4f06-a9c8-66d110ac5903-kube-api-access-cthkx\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180576 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a45b572-4f27-4ab2-9eb6-488ebe78895e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180625 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180675 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-config\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.180721 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283050 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwmv\" (UniqueName: \"kubernetes.io/projected/27731060-c814-4ebe-9d1b-4ac937bdc995-kube-api-access-cvwmv\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cthkx\" (UniqueName: \"kubernetes.io/projected/b25fbb23-a487-4f06-a9c8-66d110ac5903-kube-api-access-cthkx\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283223 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283267 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a45b572-4f27-4ab2-9eb6-488ebe78895e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283301 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283355 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-config\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283411 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25fbb23-a487-4f06-a9c8-66d110ac5903-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283529 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b25fbb23-a487-4f06-a9c8-66d110ac5903-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283565 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-config\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283594 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27731060-c814-4ebe-9d1b-4ac937bdc995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283647 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6lz\" (UniqueName: \"kubernetes.io/projected/2a45b572-4f27-4ab2-9eb6-488ebe78895e-kube-api-access-cm6lz\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-config\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283721 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a45b572-4f27-4ab2-9eb6-488ebe78895e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27731060-c814-4ebe-9d1b-4ac937bdc995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.283865 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.285074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27731060-c814-4ebe-9d1b-4ac937bdc995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.285408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b25fbb23-a487-4f06-a9c8-66d110ac5903-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.285918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2a45b572-4f27-4ab2-9eb6-488ebe78895e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.286311 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.286650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.287399 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.287443 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da80d479593dde95ce30ed07aad3680cff3e27fb0b14772619c75ae7e0716402/globalmount\"" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.287485 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.287543 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5dccc5d86df751d62af29506421feba5297cba505d50dbfa726f6f8b44e27016/globalmount\"" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.287592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27731060-c814-4ebe-9d1b-4ac937bdc995-config\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.288111 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.288144 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c28353097a201a358f56a95c7a81a209c4b510746e0305584bb8ecd5cb0194f/globalmount\"" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.288365 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25fbb23-a487-4f06-a9c8-66d110ac5903-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.288870 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b25fbb23-a487-4f06-a9c8-66d110ac5903-config\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.289538 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-config\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.289571 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a45b572-4f27-4ab2-9eb6-488ebe78895e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.298467 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a45b572-4f27-4ab2-9eb6-488ebe78895e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.304332 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27731060-c814-4ebe-9d1b-4ac937bdc995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.308522 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cthkx\" (UniqueName: \"kubernetes.io/projected/b25fbb23-a487-4f06-a9c8-66d110ac5903-kube-api-access-cthkx\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.312850 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvwmv\" (UniqueName: \"kubernetes.io/projected/27731060-c814-4ebe-9d1b-4ac937bdc995-kube-api-access-cvwmv\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.320046 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6lz\" (UniqueName: \"kubernetes.io/projected/2a45b572-4f27-4ab2-9eb6-488ebe78895e-kube-api-access-cm6lz\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.331622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4174486-758b-47be-bdf1-bb0bb4dbc994\") pod \"ovsdbserver-nb-2\" (UID: \"b25fbb23-a487-4f06-a9c8-66d110ac5903\") " pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.343469 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-409de4aa-e740-4cc1-b5ad-2745b621dce9\") pod \"ovsdbserver-nb-1\" (UID: \"2a45b572-4f27-4ab2-9eb6-488ebe78895e\") " pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.351015 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5deee38f-e5bf-4650-bb08-a52f820a6006\") pod \"ovsdbserver-nb-0\" (UID: \"27731060-c814-4ebe-9d1b-4ac937bdc995\") " pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.352389 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.370319 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.418212 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.434660 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.441192 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.634949 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:19 crc kubenswrapper[4752]: I1124 12:29:19.910563 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.030799 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.116074 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.119678 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2a45b572-4f27-4ab2-9eb6-488ebe78895e","Type":"ContainerStarted","Data":"d012eebbb870703b27b595ee331421c97af405884c0e6f8ed3c74a20d02916a2"} Nov 24 12:29:20 crc kubenswrapper[4752]: W1124 12:29:20.120312 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1d8b13_b2a7_483a_b1ad_be75868386e7.slice/crio-5c21cc1a6a62e872ef98cabd15f41b18489af9f59bf34bb1ed8c8a6ff05addd0 WatchSource:0}: Error finding container 5c21cc1a6a62e872ef98cabd15f41b18489af9f59bf34bb1ed8c8a6ff05addd0: Status 404 returned error can't find the container with id 5c21cc1a6a62e872ef98cabd15f41b18489af9f59bf34bb1ed8c8a6ff05addd0 Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.121596 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"b25fbb23-a487-4f06-a9c8-66d110ac5903","Type":"ContainerStarted","Data":"6352e12d844b392ff612aef01515e2fbd3800986436bbe2ec42d0d7cd05baf60"} Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.228120 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 12:29:20 crc kubenswrapper[4752]: W1124 12:29:20.235065 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27731060_c814_4ebe_9d1b_4ac937bdc995.slice/crio-7e330f6ff789382bf13c918d032b911e66640d6ea313c9cd72274375038b0f7e WatchSource:0}: Error finding container 7e330f6ff789382bf13c918d032b911e66640d6ea313c9cd72274375038b0f7e: Status 404 returned error can't find the container with id 7e330f6ff789382bf13c918d032b911e66640d6ea313c9cd72274375038b0f7e Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.728317 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:29:20 crc kubenswrapper[4752]: E1124 12:29:20.728765 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:29:20 crc kubenswrapper[4752]: I1124 12:29:20.955171 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 24 12:29:20 crc kubenswrapper[4752]: W1124 12:29:20.963163 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63dc0a8c_07d1_4962_8686_22042a0db911.slice/crio-1e88eed0c057ce2e5c7e3f63aa0b33a85105e0e016e16b56839bfed377902115 WatchSource:0}: Error finding container 
1e88eed0c057ce2e5c7e3f63aa0b33a85105e0e016e16b56839bfed377902115: Status 404 returned error can't find the container with id 1e88eed0c057ce2e5c7e3f63aa0b33a85105e0e016e16b56839bfed377902115 Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.044588 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.129544 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"82c20dce-5921-46ea-aaa2-b62e24923c1c","Type":"ContainerStarted","Data":"39ed050ea665f0915e43118d4834593e8ef2dbc85070fcf368678b1b4db5404c"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.137023 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27731060-c814-4ebe-9d1b-4ac937bdc995","Type":"ContainerStarted","Data":"77d5d703cbf2e4712c73610677b61406e12235d4c916e31ad1f76f97ddf4be74"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.137098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27731060-c814-4ebe-9d1b-4ac937bdc995","Type":"ContainerStarted","Data":"f049bdaa04e563a8ecbebee39711a092a42a6dd456cc89b3f81ffe15367e2e6c"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.137126 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"27731060-c814-4ebe-9d1b-4ac937bdc995","Type":"ContainerStarted","Data":"7e330f6ff789382bf13c918d032b911e66640d6ea313c9cd72274375038b0f7e"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.140289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1d1d8b13-b2a7-483a-b1ad-be75868386e7","Type":"ContainerStarted","Data":"fcc1fd279d188c2176279b65723888adac8020d63ceffe69a31f67ee4acb3d58"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.140336 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1d1d8b13-b2a7-483a-b1ad-be75868386e7","Type":"ContainerStarted","Data":"65eb6c68310fdf47e03bb90b98e83d52cd1f2439519592cfbd4cc96cbfa3f99b"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.140359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1d1d8b13-b2a7-483a-b1ad-be75868386e7","Type":"ContainerStarted","Data":"5c21cc1a6a62e872ef98cabd15f41b18489af9f59bf34bb1ed8c8a6ff05addd0"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.145773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2a45b572-4f27-4ab2-9eb6-488ebe78895e","Type":"ContainerStarted","Data":"b6999efb7e8eee14c53648e93a403ae21e66470f830df83f6401a8ccbc01c07c"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.145932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2a45b572-4f27-4ab2-9eb6-488ebe78895e","Type":"ContainerStarted","Data":"a1028f554cce47c63017e2dcd72b5edd56e5aed28a5ab36d6aa6ad31dd90f75c"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.160512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"b25fbb23-a487-4f06-a9c8-66d110ac5903","Type":"ContainerStarted","Data":"5822023d10bb49161b00dd8aa57787fd5df0478aef56388617afcdd2e3106efb"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.160563 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"b25fbb23-a487-4f06-a9c8-66d110ac5903","Type":"ContainerStarted","Data":"45ab846401ea97ea930515513d7710d26852e9190b1344a7d33edf8b40bd07d0"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.162716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"63dc0a8c-07d1-4962-8686-22042a0db911","Type":"ContainerStarted","Data":"1e88eed0c057ce2e5c7e3f63aa0b33a85105e0e016e16b56839bfed377902115"} Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.170121 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.170097529 podStartE2EDuration="4.170097529s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:21.159802873 +0000 UTC m=+4967.144623182" watchObservedRunningTime="2025-11-24 12:29:21.170097529 +0000 UTC m=+4967.154917838" Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.184722 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.184701268 podStartE2EDuration="4.184701268s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:21.179931801 +0000 UTC m=+4967.164752110" watchObservedRunningTime="2025-11-24 12:29:21.184701268 +0000 UTC m=+4967.169521597" Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.205651 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.205631528 podStartE2EDuration="4.205631528s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:21.199826912 +0000 UTC m=+4967.184647211" watchObservedRunningTime="2025-11-24 12:29:21.205631528 +0000 UTC m=+4967.190451817" Nov 24 12:29:21 crc kubenswrapper[4752]: I1124 12:29:21.228108 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.228084612 podStartE2EDuration="4.228084612s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:21.218386364 +0000 UTC m=+4967.203206683" watchObservedRunningTime="2025-11-24 12:29:21.228084612 +0000 UTC m=+4967.212904931" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.177945 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"63dc0a8c-07d1-4962-8686-22042a0db911","Type":"ContainerStarted","Data":"2aaa78d395c23bfce5de723907719379c682f189d7226e8881d7ea34a1ff679c"} Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.178463 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"63dc0a8c-07d1-4962-8686-22042a0db911","Type":"ContainerStarted","Data":"9badd35d1d78c7d7ef99890ec16fa48cebb67c6edbe876054e7fb03869540480"} Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.182080 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"82c20dce-5921-46ea-aaa2-b62e24923c1c","Type":"ContainerStarted","Data":"bc515afcc90610c50b8c8dcd4a9dbfc8f959e13dab15c04cb13f3bb83b8d912b"} Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.182161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"82c20dce-5921-46ea-aaa2-b62e24923c1c","Type":"ContainerStarted","Data":"fcc8fe16a3b56ab3251d2f783fb846e4bfd6d62e290494edbeced99bf5f56e1b"} Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.217123 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=5.217101905 podStartE2EDuration="5.217101905s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:22.205462551 +0000 UTC m=+4968.190282840" watchObservedRunningTime="2025-11-24 12:29:22.217101905 +0000 UTC m=+4968.201922204" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.233659 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.233630549 podStartE2EDuration="5.233630549s" podCreationTimestamp="2025-11-24 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:22.230184261 +0000 UTC m=+4968.215004560" watchObservedRunningTime="2025-11-24 12:29:22.233630549 +0000 UTC m=+4968.218450878" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.352638 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.371274 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.419215 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.436143 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.443481 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:22 crc kubenswrapper[4752]: I1124 12:29:22.636056 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.353212 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.371533 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.418910 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.435848 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.443022 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:24 crc kubenswrapper[4752]: I1124 12:29:24.635396 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.402950 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.412601 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.456953 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.459493 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.481946 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.491882 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.506264 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.577081 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.689400 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.697649 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-fjhk2"] Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.699225 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.701519 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.709840 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-fjhk2"] Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.762871 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.820279 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.820355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.820380 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxx8h\" (UniqueName: \"kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.820406 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.921997 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxx8h\" (UniqueName: \"kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.922078 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.922231 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.922270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.923437 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.923457 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.925233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.926226 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-fjhk2"] Nov 24 12:29:25 crc kubenswrapper[4752]: E1124 12:29:25.926717 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nxx8h], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" podUID="18995463-7a03-467e-9373-e80610a4d15d" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.948321 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.950576 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.956206 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.966130 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:29:25 crc kubenswrapper[4752]: I1124 12:29:25.967431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxx8h\" (UniqueName: \"kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h\") pod \"dnsmasq-dns-749675b4c7-fjhk2\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.125357 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmczf\" (UniqueName: \"kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.125708 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.125736 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.125844 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.125873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.220178 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.227401 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.227473 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.227542 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmczf\" (UniqueName: \"kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.227596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.227640 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.229053 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.229147 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.229224 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.229267 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 
12:29:26.232210 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.249403 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmczf\" (UniqueName: \"kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf\") pod \"dnsmasq-dns-74f6fc965c-l88z4\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.267281 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.267854 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.314629 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.328805 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb\") pod \"18995463-7a03-467e-9373-e80610a4d15d\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.328925 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc\") pod \"18995463-7a03-467e-9373-e80610a4d15d\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.329043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config\") pod \"18995463-7a03-467e-9373-e80610a4d15d\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.329123 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxx8h\" (UniqueName: \"kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h\") pod \"18995463-7a03-467e-9373-e80610a4d15d\" (UID: \"18995463-7a03-467e-9373-e80610a4d15d\") " Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.330120 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18995463-7a03-467e-9373-e80610a4d15d" (UID: "18995463-7a03-467e-9373-e80610a4d15d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.330522 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18995463-7a03-467e-9373-e80610a4d15d" (UID: "18995463-7a03-467e-9373-e80610a4d15d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.331579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config" (OuterVolumeSpecName: "config") pod "18995463-7a03-467e-9373-e80610a4d15d" (UID: "18995463-7a03-467e-9373-e80610a4d15d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.333840 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h" (OuterVolumeSpecName: "kube-api-access-nxx8h") pod "18995463-7a03-467e-9373-e80610a4d15d" (UID: "18995463-7a03-467e-9373-e80610a4d15d"). InnerVolumeSpecName "kube-api-access-nxx8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.432513 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.432916 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.432926 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxx8h\" (UniqueName: \"kubernetes.io/projected/18995463-7a03-467e-9373-e80610a4d15d-kube-api-access-nxx8h\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.432937 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18995463-7a03-467e-9373-e80610a4d15d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:26 crc kubenswrapper[4752]: I1124 12:29:26.783710 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.230451 4752 generic.go:334] "Generic (PLEG): container finished" podID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerID="75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2" exitCode=0 Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.230935 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-749675b4c7-fjhk2" Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.230504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" event={"ID":"ff71927d-9232-4525-8ac9-4c6c82eb4ed6","Type":"ContainerDied","Data":"75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2"} Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.231034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" event={"ID":"ff71927d-9232-4525-8ac9-4c6c82eb4ed6","Type":"ContainerStarted","Data":"dacb2bdced93223d6809a73c6f6f653e83cd35dc24534564213f37686de7ed61"} Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.312998 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-fjhk2"] Nov 24 12:29:27 crc kubenswrapper[4752]: I1124 12:29:27.328375 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-749675b4c7-fjhk2"] Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.239411 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" event={"ID":"ff71927d-9232-4525-8ac9-4c6c82eb4ed6","Type":"ContainerStarted","Data":"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3"} Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.239779 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.263705 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" podStartSLOduration=3.263685309 podStartE2EDuration="3.263685309s" podCreationTimestamp="2025-11-24 12:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:28.258877621 +0000 UTC m=+4974.243697910" watchObservedRunningTime="2025-11-24 12:29:28.263685309 +0000 UTC m=+4974.248505598" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.751537 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18995463-7a03-467e-9373-e80610a4d15d" path="/var/lib/kubelet/pods/18995463-7a03-467e-9373-e80610a4d15d/volumes" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.920629 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.921985 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.930280 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.953257 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.981121 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8b5z\" (UniqueName: \"kubernetes.io/projected/95e8030c-fdfe-47e0-811a-0b32eaa08a00-kube-api-access-z8b5z\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.981209 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/95e8030c-fdfe-47e0-811a-0b32eaa08a00-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:28 crc kubenswrapper[4752]: I1124 12:29:28.981267 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.082860 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8b5z\" (UniqueName: \"kubernetes.io/projected/95e8030c-fdfe-47e0-811a-0b32eaa08a00-kube-api-access-z8b5z\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.083423 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/95e8030c-fdfe-47e0-811a-0b32eaa08a00-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.083462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.089867 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.089917 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ae0f77cec6ac394c3d01a5f63f9f796ed9c2f666ebbbddab98a685ffc825db4/globalmount\"" pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.095008 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/95e8030c-fdfe-47e0-811a-0b32eaa08a00-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.103517 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8b5z\" (UniqueName: \"kubernetes.io/projected/95e8030c-fdfe-47e0-811a-0b32eaa08a00-kube-api-access-z8b5z\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.133923 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1ab895b-5977-4f6d-b224-d0056a5f2772\") pod \"ovn-copy-data\" (UID: \"95e8030c-fdfe-47e0-811a-0b32eaa08a00\") " pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.255554 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Nov 24 12:29:29 crc kubenswrapper[4752]: I1124 12:29:29.817961 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 24 12:29:30 crc kubenswrapper[4752]: I1124 12:29:30.263291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"95e8030c-fdfe-47e0-811a-0b32eaa08a00","Type":"ContainerStarted","Data":"5eca5193b89af0ef078eb4162576f03ed515614682c3c6f743e63daaf617eec7"} Nov 24 12:29:30 crc kubenswrapper[4752]: I1124 12:29:30.263340 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"95e8030c-fdfe-47e0-811a-0b32eaa08a00","Type":"ContainerStarted","Data":"5d9a53a6e67c5d11b23ad7e9f0a2390d7740b9b4b708c8dfce6fa81e1c3172fd"} Nov 24 12:29:30 crc kubenswrapper[4752]: I1124 12:29:30.288171 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.288151036 podStartE2EDuration="3.288151036s" podCreationTimestamp="2025-11-24 12:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:30.279973551 +0000 UTC m=+4976.264793860" watchObservedRunningTime="2025-11-24 12:29:30.288151036 +0000 UTC m=+4976.272971335" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.500809 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.508194 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.512993 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.513201 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.513458 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-dngbw" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.520464 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.623837 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.623895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-scripts\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.623933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djj74\" (UniqueName: \"kubernetes.io/projected/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-kube-api-access-djj74\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.624016 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.624095 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-config\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.725757 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.725814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-scripts\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.725843 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djj74\" (UniqueName: \"kubernetes.io/projected/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-kube-api-access-djj74\") pod 
\"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.725881 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.725964 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-config\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.726550 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.727058 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-config\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.727632 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-scripts\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.727666 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:29:35 crc kubenswrapper[4752]: E1124 12:29:35.728002 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.733945 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.752592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djj74\" (UniqueName: \"kubernetes.io/projected/25e82bdb-90e7-49a0-a243-ab4acabf0aa7-kube-api-access-djj74\") pod \"ovn-northd-0\" (UID: \"25e82bdb-90e7-49a0-a243-ab4acabf0aa7\") " pod="openstack/ovn-northd-0" Nov 24 12:29:35 crc kubenswrapper[4752]: I1124 12:29:35.842625 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.293955 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.316788 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.317358 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25e82bdb-90e7-49a0-a243-ab4acabf0aa7","Type":"ContainerStarted","Data":"5eb976d6fbb840ad22916136f68600573070b519bfa2c84a5717d534a23c559e"} Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.373065 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"] Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.373381 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="dnsmasq-dns" containerID="cri-o://b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59" gracePeriod=10 Nov 24 12:29:36 crc kubenswrapper[4752]: I1124 12:29:36.914736 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.051111 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc\") pod \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.051281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6x8f\" (UniqueName: \"kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f\") pod \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.051438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config\") pod \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\" (UID: \"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582\") " Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.056649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f" (OuterVolumeSpecName: "kube-api-access-t6x8f") pod "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" (UID: "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582"). InnerVolumeSpecName "kube-api-access-t6x8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.087203 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" (UID: "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.093673 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config" (OuterVolumeSpecName: "config") pod "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" (UID: "bbc20788-5a71-4bb6-a9f3-6a41f1b9f582"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.153082 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6x8f\" (UniqueName: \"kubernetes.io/projected/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-kube-api-access-t6x8f\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.153109 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.153120 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.328300 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25e82bdb-90e7-49a0-a243-ab4acabf0aa7","Type":"ContainerStarted","Data":"4be2e7b3b1e63644c76a16b29db7e320aba62b8442776978bf2c09ea2f0168a2"} Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.328348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25e82bdb-90e7-49a0-a243-ab4acabf0aa7","Type":"ContainerStarted","Data":"3023f7f2e72145a6ca484462cc8009225369a33d177f95193cb54e9de810103f"} Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.328433 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.330468 4752 generic.go:334] "Generic (PLEG): container finished" podID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerID="b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59" exitCode=0 Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.330570 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" event={"ID":"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582","Type":"ContainerDied","Data":"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59"} Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.330602 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2ph2n" event={"ID":"bbc20788-5a71-4bb6-a9f3-6a41f1b9f582","Type":"ContainerDied","Data":"54d3196f8ce8e288a9f3b71217dfaf5cbe8f9efb987a35983cf642af4b89e2a1"} Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.330622 4752 scope.go:117] "RemoveContainer" containerID="b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59" Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.330822 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.353087 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.353055082 podStartE2EDuration="2.353055082s" podCreationTimestamp="2025-11-24 12:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:37.346222616 +0000 UTC m=+4983.331042925" watchObservedRunningTime="2025-11-24 12:29:37.353055082 +0000 UTC m=+4983.337875411"
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.371218 4752 scope.go:117] "RemoveContainer" containerID="2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05"
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.382375 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"]
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.391856 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2ph2n"]
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.396290 4752 scope.go:117] "RemoveContainer" containerID="b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59"
Nov 24 12:29:37 crc kubenswrapper[4752]: E1124 12:29:37.396789 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59\": container with ID starting with b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59 not found: ID does not exist" containerID="b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59"
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.396833 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59"} err="failed to get container status \"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59\": rpc error: code = NotFound desc = could not find container \"b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59\": container with ID starting with b3fea195a21ea21f3da8e4e11888480da96d09fb7eb4f8e0e8744897f13b1b59 not found: ID does not exist"
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.396864 4752 scope.go:117] "RemoveContainer" containerID="2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05"
Nov 24 12:29:37 crc kubenswrapper[4752]: E1124 12:29:37.397610 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05\": container with ID starting with 2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05 not found: ID does not exist" containerID="2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05"
Nov 24 12:29:37 crc kubenswrapper[4752]: I1124 12:29:37.397645 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05"} err="failed to get container status \"2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05\": rpc error: code = NotFound desc = could not find container \"2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05\": container with ID starting with 2f4d6c8ce2ee68ba5ab0bcd8124fc381a6e0ecaf5294b796c5b9be59ee120a05 not found: ID does not exist"
Nov 24 12:29:38 crc kubenswrapper[4752]: I1124 12:29:38.739945 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" path="/var/lib/kubelet/pods/bbc20788-5a71-4bb6-a9f3-6a41f1b9f582/volumes"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.563394 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q8jpk"]
Nov 24 12:29:40 crc kubenswrapper[4752]: E1124 12:29:40.564951 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="init"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.564992 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="init"
Nov 24 12:29:40 crc kubenswrapper[4752]: E1124 12:29:40.565008 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="dnsmasq-dns"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.565017 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="dnsmasq-dns"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.565256 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc20788-5a71-4bb6-a9f3-6a41f1b9f582" containerName="dnsmasq-dns"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.566740 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.569561 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8jpk"]
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.613346 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.613613 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkj7\" (UniqueName: \"kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.669886 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5143-account-create-k4g52"]
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.671067 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.673043 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.680403 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5143-account-create-k4g52"]
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.715194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkj7\" (UniqueName: \"kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.715305 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.715331 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4f2\" (UniqueName: \"kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.715353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.716157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.743507 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkj7\" (UniqueName: \"kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7\") pod \"keystone-db-create-q8jpk\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.817175 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.817224 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4f2\" (UniqueName: \"kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.818100 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.836319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4f2\" (UniqueName: \"kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2\") pod \"keystone-5143-account-create-k4g52\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.920675 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:40 crc kubenswrapper[4752]: I1124 12:29:40.986766 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5143-account-create-k4g52"
Nov 24 12:29:41 crc kubenswrapper[4752]: I1124 12:29:41.446447 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8jpk"]
Nov 24 12:29:41 crc kubenswrapper[4752]: W1124 12:29:41.451091 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcd7330_c0e2_4951_b4a1_250826e169b5.slice/crio-356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897 WatchSource:0}: Error finding container 356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897: Status 404 returned error can't find the container with id 356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897
Nov 24 12:29:41 crc kubenswrapper[4752]: I1124 12:29:41.537867 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5143-account-create-k4g52"]
Nov 24 12:29:41 crc kubenswrapper[4752]: W1124 12:29:41.542254 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod247394d9_138c_4d51_8cee_94abc971faf2.slice/crio-09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840 WatchSource:0}: Error finding container 09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840: Status 404 returned error can't find the container with id 09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840
Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.378964 4752 generic.go:334] "Generic (PLEG): container finished" podID="7dcd7330-c0e2-4951-b4a1-250826e169b5" containerID="e1af3090dba1cf6902ff93cb18c5dbc472f819c44f69f8bfdd6048947c4fd112" exitCode=0
Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.379032 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8jpk" event={"ID":"7dcd7330-c0e2-4951-b4a1-250826e169b5","Type":"ContainerDied","Data":"e1af3090dba1cf6902ff93cb18c5dbc472f819c44f69f8bfdd6048947c4fd112"}
Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.379111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8jpk" event={"ID":"7dcd7330-c0e2-4951-b4a1-250826e169b5","Type":"ContainerStarted","Data":"356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897"}
event={"ID":"7dcd7330-c0e2-4951-b4a1-250826e169b5","Type":"ContainerStarted","Data":"356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897"} Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.386620 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5143-account-create-k4g52" event={"ID":"247394d9-138c-4d51-8cee-94abc971faf2","Type":"ContainerDied","Data":"97904e03a9e69f15199ad08f14ced0b5d650e7e6aebb105731ce3ea0a5b67737"} Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.386438 4752 generic.go:334] "Generic (PLEG): container finished" podID="247394d9-138c-4d51-8cee-94abc971faf2" containerID="97904e03a9e69f15199ad08f14ced0b5d650e7e6aebb105731ce3ea0a5b67737" exitCode=0 Nov 24 12:29:42 crc kubenswrapper[4752]: I1124 12:29:42.386786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5143-account-create-k4g52" event={"ID":"247394d9-138c-4d51-8cee-94abc971faf2","Type":"ContainerStarted","Data":"09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840"} Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.806782 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5143-account-create-k4g52" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.818000 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8jpk" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.871188 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4f2\" (UniqueName: \"kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2\") pod \"247394d9-138c-4d51-8cee-94abc971faf2\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.871237 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts\") pod \"247394d9-138c-4d51-8cee-94abc971faf2\" (UID: \"247394d9-138c-4d51-8cee-94abc971faf2\") " Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.871262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts\") pod \"7dcd7330-c0e2-4951-b4a1-250826e169b5\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.871304 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qkj7\" (UniqueName: \"kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7\") pod \"7dcd7330-c0e2-4951-b4a1-250826e169b5\" (UID: \"7dcd7330-c0e2-4951-b4a1-250826e169b5\") " Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.872119 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7dcd7330-c0e2-4951-b4a1-250826e169b5" (UID: "7dcd7330-c0e2-4951-b4a1-250826e169b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.872121 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "247394d9-138c-4d51-8cee-94abc971faf2" (UID: "247394d9-138c-4d51-8cee-94abc971faf2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.877461 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2" (OuterVolumeSpecName: "kube-api-access-mq4f2") pod "247394d9-138c-4d51-8cee-94abc971faf2" (UID: "247394d9-138c-4d51-8cee-94abc971faf2"). InnerVolumeSpecName "kube-api-access-mq4f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.877838 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7" (OuterVolumeSpecName: "kube-api-access-2qkj7") pod "7dcd7330-c0e2-4951-b4a1-250826e169b5" (UID: "7dcd7330-c0e2-4951-b4a1-250826e169b5"). InnerVolumeSpecName "kube-api-access-2qkj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.973010 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4f2\" (UniqueName: \"kubernetes.io/projected/247394d9-138c-4d51-8cee-94abc971faf2-kube-api-access-mq4f2\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.973082 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/247394d9-138c-4d51-8cee-94abc971faf2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.973096 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dcd7330-c0e2-4951-b4a1-250826e169b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:43 crc kubenswrapper[4752]: I1124 12:29:43.973109 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qkj7\" (UniqueName: \"kubernetes.io/projected/7dcd7330-c0e2-4951-b4a1-250826e169b5-kube-api-access-2qkj7\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.412047 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5143-account-create-k4g52" event={"ID":"247394d9-138c-4d51-8cee-94abc971faf2","Type":"ContainerDied","Data":"09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840"} Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.412482 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09bce62ea9fd9a94c0708606f4ca6afb8990869b3c04d64d58ef1f99d242f840" Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.412090 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.414561 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8jpk" event={"ID":"7dcd7330-c0e2-4951-b4a1-250826e169b5","Type":"ContainerDied","Data":"356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897"}
Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.414631 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356288d71b55a2239eb42686b98daa6a9817070ef8e182b8506be24baf1d9897"
Nov 24 12:29:44 crc kubenswrapper[4752]: I1124 12:29:44.414738 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8jpk"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.111679 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vkk9s"]
Nov 24 12:29:46 crc kubenswrapper[4752]: E1124 12:29:46.112048 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcd7330-c0e2-4951-b4a1-250826e169b5" containerName="mariadb-database-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.112062 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcd7330-c0e2-4951-b4a1-250826e169b5" containerName="mariadb-database-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: E1124 12:29:46.112094 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247394d9-138c-4d51-8cee-94abc971faf2" containerName="mariadb-account-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.112100 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="247394d9-138c-4d51-8cee-94abc971faf2" containerName="mariadb-account-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.112244 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcd7330-c0e2-4951-b4a1-250826e169b5" containerName="mariadb-database-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.112266 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="247394d9-138c-4d51-8cee-94abc971faf2" containerName="mariadb-account-create"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.112786 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.114787 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.115126 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.115269 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnw8d"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.122994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vkk9s"]
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.124456 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.215131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9np6v\" (UniqueName: \"kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.215210 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.215378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.317011 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9np6v\" (UniqueName: \"kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.317358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.317404 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.324258 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s"
pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.325744 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.342312 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9np6v\" (UniqueName: \"kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v\") pod \"keystone-db-sync-vkk9s\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.448345 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:46 crc kubenswrapper[4752]: I1124 12:29:46.886557 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vkk9s"] Nov 24 12:29:47 crc kubenswrapper[4752]: I1124 12:29:47.455179 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkk9s" event={"ID":"5a44ba09-75b3-494c-88b1-065eb978f33a","Type":"ContainerStarted","Data":"3fcd22e27115e031aa9cfbdaaffd5d8571a4f719060255e300db77cb4053e43e"} Nov 24 12:29:47 crc kubenswrapper[4752]: I1124 12:29:47.455614 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkk9s" event={"ID":"5a44ba09-75b3-494c-88b1-065eb978f33a","Type":"ContainerStarted","Data":"914d5c018b82e0eac46e94fb960b980a715062f29cfc8961c9a7be6834afd44c"} Nov 24 12:29:47 crc kubenswrapper[4752]: I1124 12:29:47.481394 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vkk9s" podStartSLOduration=1.481370192 podStartE2EDuration="1.481370192s" podCreationTimestamp="2025-11-24 12:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:47.480331502 +0000 UTC m=+4993.465151831" watchObservedRunningTime="2025-11-24 12:29:47.481370192 +0000 UTC m=+4993.466190491" Nov 24 12:29:47 crc kubenswrapper[4752]: I1124 12:29:47.728887 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:29:47 crc kubenswrapper[4752]: E1124 12:29:47.729439 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:29:49 crc kubenswrapper[4752]: I1124 12:29:49.475362 4752 generic.go:334] "Generic (PLEG): container finished" podID="5a44ba09-75b3-494c-88b1-065eb978f33a" containerID="3fcd22e27115e031aa9cfbdaaffd5d8571a4f719060255e300db77cb4053e43e" exitCode=0 Nov 24 12:29:49 crc kubenswrapper[4752]: I1124 12:29:49.475813 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkk9s" event={"ID":"5a44ba09-75b3-494c-88b1-065eb978f33a","Type":"ContainerDied","Data":"3fcd22e27115e031aa9cfbdaaffd5d8571a4f719060255e300db77cb4053e43e"} Nov 24 
12:29:50 crc kubenswrapper[4752]: I1124 12:29:50.902885 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 24 12:29:50 crc kubenswrapper[4752]: I1124 12:29:50.908993 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.030975 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9np6v\" (UniqueName: \"kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v\") pod \"5a44ba09-75b3-494c-88b1-065eb978f33a\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.031041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle\") pod \"5a44ba09-75b3-494c-88b1-065eb978f33a\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.031109 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data\") pod \"5a44ba09-75b3-494c-88b1-065eb978f33a\" (UID: \"5a44ba09-75b3-494c-88b1-065eb978f33a\") " Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.037719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v" (OuterVolumeSpecName: "kube-api-access-9np6v") pod "5a44ba09-75b3-494c-88b1-065eb978f33a" (UID: "5a44ba09-75b3-494c-88b1-065eb978f33a"). InnerVolumeSpecName "kube-api-access-9np6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.061548 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a44ba09-75b3-494c-88b1-065eb978f33a" (UID: "5a44ba09-75b3-494c-88b1-065eb978f33a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.082855 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data" (OuterVolumeSpecName: "config-data") pod "5a44ba09-75b3-494c-88b1-065eb978f33a" (UID: "5a44ba09-75b3-494c-88b1-065eb978f33a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.133436 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9np6v\" (UniqueName: \"kubernetes.io/projected/5a44ba09-75b3-494c-88b1-065eb978f33a-kube-api-access-9np6v\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.133476 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.133489 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a44ba09-75b3-494c-88b1-065eb978f33a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.497208 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vkk9s" event={"ID":"5a44ba09-75b3-494c-88b1-065eb978f33a","Type":"ContainerDied","Data":"914d5c018b82e0eac46e94fb960b980a715062f29cfc8961c9a7be6834afd44c"} Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.497256 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914d5c018b82e0eac46e94fb960b980a715062f29cfc8961c9a7be6834afd44c" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.497304 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vkk9s" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.779458 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:29:51 crc kubenswrapper[4752]: E1124 12:29:51.785469 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a44ba09-75b3-494c-88b1-065eb978f33a" containerName="keystone-db-sync" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.785490 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a44ba09-75b3-494c-88b1-065eb978f33a" containerName="keystone-db-sync" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.785666 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a44ba09-75b3-494c-88b1-065eb978f33a" containerName="keystone-db-sync" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.786515 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.800006 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.808435 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rcq5z"] Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.838900 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.843429 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnw8d" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.848939 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.849146 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.849190 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.856600 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.866271 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rcq5z"] Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.950920 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.950993 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951025 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951054 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951085 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hd58\" (UniqueName: \"kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951101 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951128 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgzv\" (UniqueName: \"kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:51 crc kubenswrapper[4752]: I1124 12:29:51.951257 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052652 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052687 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052743 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052826 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hd58\" (UniqueName: \"kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.052927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgzv\" (UniqueName: \"kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.054990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.054990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.055661 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc\") 
pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.056332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.058206 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.060307 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.061393 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.070543 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.073022 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.077425 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hd58\" (UniqueName: \"kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58\") pod \"dnsmasq-dns-7c78d97f89-ljr8f\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.079805 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgzv\" (UniqueName: \"kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv\") pod \"keystone-bootstrap-rcq5z\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") " pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.105223 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.171575 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.569831 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:29:52 crc kubenswrapper[4752]: W1124 12:29:52.571988 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57dda24e_e364_43f0_86a0_b7c15e511d21.slice/crio-df42fdd2d0f595a9678f2dd3540f6d696aa2034396964902d901b9c1473069e7 WatchSource:0}: Error finding container df42fdd2d0f595a9678f2dd3540f6d696aa2034396964902d901b9c1473069e7: Status 404 returned error can't find the container with id df42fdd2d0f595a9678f2dd3540f6d696aa2034396964902d901b9c1473069e7 Nov 24 12:29:52 crc kubenswrapper[4752]: I1124 12:29:52.658546 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rcq5z"] Nov 24 12:29:53 crc kubenswrapper[4752]: I1124 12:29:53.514459 4752 generic.go:334] "Generic (PLEG): container finished" podID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerID="86241e7383c395847541aff37734208f4111b3efe0c2f9c24c12229d4f17f538" exitCode=0 Nov 24 12:29:53 crc kubenswrapper[4752]: I1124 12:29:53.514578 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" event={"ID":"57dda24e-e364-43f0-86a0-b7c15e511d21","Type":"ContainerDied","Data":"86241e7383c395847541aff37734208f4111b3efe0c2f9c24c12229d4f17f538"} Nov 24 12:29:53 crc kubenswrapper[4752]: I1124 12:29:53.516693 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" event={"ID":"57dda24e-e364-43f0-86a0-b7c15e511d21","Type":"ContainerStarted","Data":"df42fdd2d0f595a9678f2dd3540f6d696aa2034396964902d901b9c1473069e7"} Nov 24 12:29:53 crc kubenswrapper[4752]: I1124 12:29:53.526029 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcq5z" event={"ID":"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed","Type":"ContainerStarted","Data":"efbf80a5674af4565dca3f59641ecbc15d40138ce03b714be6ba9f77b29ba893"} Nov 24 12:29:53 crc kubenswrapper[4752]: I1124 12:29:53.526086 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcq5z" event={"ID":"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed","Type":"ContainerStarted","Data":"7b60050a5afd68f61d61ecbd5f21f28d784bf11db6ba629b2f0d9ce3fed1b5bf"} Nov 24 12:29:54 crc kubenswrapper[4752]: I1124 12:29:54.536150 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" event={"ID":"57dda24e-e364-43f0-86a0-b7c15e511d21","Type":"ContainerStarted","Data":"4bc2a4252f2a315eda67ff50c5f38e7e0202f8b75440813251bd74851c636377"} Nov 24 12:29:54 crc kubenswrapper[4752]: I1124 12:29:54.536780 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:29:54 crc kubenswrapper[4752]: I1124 12:29:54.569135 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rcq5z" podStartSLOduration=3.569117784 podStartE2EDuration="3.569117784s" podCreationTimestamp="2025-11-24 12:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:29:53.550145102 +0000 UTC m=+4999.534965391" watchObservedRunningTime="2025-11-24 12:29:54.569117784 +0000 UTC m=+5000.553938073" Nov 24 12:29:54 crc kubenswrapper[4752]: I1124 
Nov 24 12:29:56 crc kubenswrapper[4752]: I1124 12:29:56.555167 4752 generic.go:334] "Generic (PLEG): container finished" podID="2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" containerID="efbf80a5674af4565dca3f59641ecbc15d40138ce03b714be6ba9f77b29ba893" exitCode=0
Nov 24 12:29:56 crc kubenswrapper[4752]: I1124 12:29:56.555270 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcq5z" event={"ID":"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed","Type":"ContainerDied","Data":"efbf80a5674af4565dca3f59641ecbc15d40138ce03b714be6ba9f77b29ba893"}
Nov 24 12:29:57 crc kubenswrapper[4752]: I1124 12:29:57.927280 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rcq5z"
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053373 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053462 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053635 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053716 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053792 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkgzv\" (UniqueName: \"kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.053851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts\") pod \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\" (UID: \"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed\") "
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.060468 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts" (OuterVolumeSpecName: "scripts") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.060578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.062203 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv" (OuterVolumeSpecName: "kube-api-access-zkgzv") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "kube-api-access-zkgzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.062510 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.081312 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data" (OuterVolumeSpecName: "config-data") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.107132 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" (UID: "2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156791 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156850 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156874 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156896 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156915 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkgzv\" (UniqueName: \"kubernetes.io/projected/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-kube-api-access-zkgzv\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.156936 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.580245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rcq5z" event={"ID":"2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed","Type":"ContainerDied","Data":"7b60050a5afd68f61d61ecbd5f21f28d784bf11db6ba629b2f0d9ce3fed1b5bf"} Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.580686 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b60050a5afd68f61d61ecbd5f21f28d784bf11db6ba629b2f0d9ce3fed1b5bf" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.580334 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rcq5z" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.683821 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rcq5z"] Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.694260 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rcq5z"] Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.739726 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" path="/var/lib/kubelet/pods/2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed/volumes" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.767443 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b7ffs"] Nov 24 12:29:58 crc kubenswrapper[4752]: E1124 12:29:58.768002 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" containerName="keystone-bootstrap" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.768032 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" containerName="keystone-bootstrap" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.768383 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bafaa3c-1fda-4a7d-b06a-4a7e46c7eaed" containerName="keystone-bootstrap" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.769303 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.772284 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.772529 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.773067 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.773474 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.773721 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnw8d" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.779036 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b7ffs"] Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.871598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.871786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.871839 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.872029 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.872159 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.872213 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2dd\" (UniqueName: \"kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974036 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974107 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974265 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974361 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2dd\" (UniqueName: \"kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.974462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data\") pod 
\"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.981312 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.981977 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.982161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.987994 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:58 crc kubenswrapper[4752]: I1124 12:29:58.988891 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:59 crc kubenswrapper[4752]: I1124 12:29:59.006071 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2dd\" (UniqueName: \"kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd\") pod \"keystone-bootstrap-b7ffs\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:59 crc kubenswrapper[4752]: I1124 12:29:59.100618 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:29:59 crc kubenswrapper[4752]: I1124 12:29:59.571416 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b7ffs"] Nov 24 12:29:59 crc kubenswrapper[4752]: I1124 12:29:59.589078 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7ffs" event={"ID":"c8abdcde-a74d-4d46-9827-720f5202a317","Type":"ContainerStarted","Data":"c6666194a771567d86b2b8921b5b839ce169fc1008dea15c067061c9fb8157cb"} Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.144339 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw"] Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.146532 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.151005 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.154946 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw"] Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.156098 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.300301 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.300358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.300734 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2wp\" (UniqueName: \"kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.401958 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2wp\" (UniqueName: \"kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.402031 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.402081 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.403370 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume\") pod 
\"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.411387 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.418415 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2wp\" (UniqueName: \"kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp\") pod \"collect-profiles-29399790-blmbw\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.466050 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.600073 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7ffs" event={"ID":"c8abdcde-a74d-4d46-9827-720f5202a317","Type":"ContainerStarted","Data":"9e52b489bccd337633be634c1b697052e7c0f23ed4f699b300767ccfddfc6d97"} Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.616376 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b7ffs" podStartSLOduration=2.616361786 podStartE2EDuration="2.616361786s" podCreationTimestamp="2025-11-24 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:30:00.615967725 +0000 UTC m=+5006.600788034" watchObservedRunningTime="2025-11-24 12:30:00.616361786 +0000 UTC m=+5006.601182075" Nov 24 12:30:00 crc kubenswrapper[4752]: I1124 12:30:00.902911 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw"] Nov 24 12:30:00 crc kubenswrapper[4752]: W1124 12:30:00.911080 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351f3e1f_4d6b_4eaf_a9e1_32063b5d2c32.slice/crio-65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e WatchSource:0}: Error finding container 65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e: Status 404 returned error can't find the container with id 65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e Nov 24 12:30:01 crc kubenswrapper[4752]: I1124 12:30:01.607595 4752 generic.go:334] "Generic (PLEG): container finished" podID="351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" containerID="ef2fed93ad2b502a48ca45b4eed05d67a84a7e3fea151747bc48d80d57be64d0" exitCode=0 Nov 24 12:30:01 crc kubenswrapper[4752]: I1124 12:30:01.607663 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" event={"ID":"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32","Type":"ContainerDied","Data":"ef2fed93ad2b502a48ca45b4eed05d67a84a7e3fea151747bc48d80d57be64d0"} Nov 24 12:30:01 crc kubenswrapper[4752]: I1124 12:30:01.607960 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" event={"ID":"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32","Type":"ContainerStarted","Data":"65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e"} Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.107660 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.193486 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.194166 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="dnsmasq-dns" containerID="cri-o://b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3" gracePeriod=10 Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.600856 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.626697 4752 generic.go:334] "Generic (PLEG): container finished" podID="c8abdcde-a74d-4d46-9827-720f5202a317" containerID="9e52b489bccd337633be634c1b697052e7c0f23ed4f699b300767ccfddfc6d97" exitCode=0 Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.626805 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7ffs" event={"ID":"c8abdcde-a74d-4d46-9827-720f5202a317","Type":"ContainerDied","Data":"9e52b489bccd337633be634c1b697052e7c0f23ed4f699b300767ccfddfc6d97"} Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.631127 4752 generic.go:334] "Generic (PLEG): container finished" podID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerID="b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3" exitCode=0 Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.631183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" event={"ID":"ff71927d-9232-4525-8ac9-4c6c82eb4ed6","Type":"ContainerDied","Data":"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3"} Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.631217 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.631240 4752 scope.go:117] "RemoveContainer" containerID="b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.631224 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6fc965c-l88z4" event={"ID":"ff71927d-9232-4525-8ac9-4c6c82eb4ed6","Type":"ContainerDied","Data":"dacb2bdced93223d6809a73c6f6f653e83cd35dc24534564213f37686de7ed61"} Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.659711 4752 scope.go:117] "RemoveContainer" containerID="75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.684916 4752 scope.go:117] "RemoveContainer" containerID="b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3" Nov 24 12:30:02 crc kubenswrapper[4752]: E1124 12:30:02.685420 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3\": container with ID starting with b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3 not found: ID does not exist" containerID="b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.685478 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3"} err="failed to get container status \"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3\": rpc error: code = NotFound desc = could not find container \"b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3\": container with ID starting with b9721523a90b8183528f16d698b8b1882e8d8910e3b4940cea4fefc8ddbaeff3 not found: ID does not exist" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.685507 4752 scope.go:117] "RemoveContainer" containerID="75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2" Nov 24 12:30:02 crc kubenswrapper[4752]: E1124 12:30:02.686185 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2\": container with ID starting with 75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2 not found: ID does not exist" containerID="75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.686217 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2"} err="failed to get container status \"75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2\": rpc error: code = NotFound desc = could not find container \"75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2\": container with ID starting with 75badecf99b7f3d4438d5f0647c0ddefed749d64e009033df6e1a39625493cc2 not found: ID does not exist" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.742157 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:30:02 crc kubenswrapper[4752]: E1124 12:30:02.742472 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.745195 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb\") pod \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.745268 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb\") pod \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.745321 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config\") pod \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.745349 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmczf\" (UniqueName: \"kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf\") pod \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.745392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc\") pod \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\" (UID: \"ff71927d-9232-4525-8ac9-4c6c82eb4ed6\") " Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.754911 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf" (OuterVolumeSpecName: "kube-api-access-cmczf") pod "ff71927d-9232-4525-8ac9-4c6c82eb4ed6" (UID: "ff71927d-9232-4525-8ac9-4c6c82eb4ed6"). InnerVolumeSpecName "kube-api-access-cmczf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.803625 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config" (OuterVolumeSpecName: "config") pod "ff71927d-9232-4525-8ac9-4c6c82eb4ed6" (UID: "ff71927d-9232-4525-8ac9-4c6c82eb4ed6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.813554 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff71927d-9232-4525-8ac9-4c6c82eb4ed6" (UID: "ff71927d-9232-4525-8ac9-4c6c82eb4ed6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.817130 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff71927d-9232-4525-8ac9-4c6c82eb4ed6" (UID: "ff71927d-9232-4525-8ac9-4c6c82eb4ed6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.824856 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff71927d-9232-4525-8ac9-4c6c82eb4ed6" (UID: "ff71927d-9232-4525-8ac9-4c6c82eb4ed6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.846884 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.846918 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.846929 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.846941 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmczf\" (UniqueName: \"kubernetes.io/projected/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-kube-api-access-cmczf\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.846950 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff71927d-9232-4525-8ac9-4c6c82eb4ed6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.941639 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.978361 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:30:02 crc kubenswrapper[4752]: I1124 12:30:02.987651 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6fc965c-l88z4"] Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.048721 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw2wp\" (UniqueName: \"kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp\") pod \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.048811 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume\") pod \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.048895 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume\") pod \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\" (UID: \"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32\") " Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.050212 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume" (OuterVolumeSpecName: "config-volume") pod "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" (UID: "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.053825 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" (UID: "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.053987 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp" (OuterVolumeSpecName: "kube-api-access-jw2wp") pod "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" (UID: "351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32"). InnerVolumeSpecName "kube-api-access-jw2wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.150344 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw2wp\" (UniqueName: \"kubernetes.io/projected/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-kube-api-access-jw2wp\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.150377 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.150389 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.641767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" event={"ID":"351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32","Type":"ContainerDied","Data":"65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e"} Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.642248 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65f8d5042b40a849ec5b00cd6cdd57dbd4b1ebf680111379f6b7ca665686585e" Nov 24 12:30:03 crc kubenswrapper[4752]: I1124 12:30:03.641794 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.022932 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk"] Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.029694 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399745-gsknk"] Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.131586 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.270564 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt2dd\" (UniqueName: \"kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.270938 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.271099 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.271205 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.271369 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.271474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle\") pod \"c8abdcde-a74d-4d46-9827-720f5202a317\" (UID: \"c8abdcde-a74d-4d46-9827-720f5202a317\") " Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.276301 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.277055 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.277094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts" (OuterVolumeSpecName: "scripts") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.277895 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd" (OuterVolumeSpecName: "kube-api-access-tt2dd") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "kube-api-access-tt2dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.299255 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data" (OuterVolumeSpecName: "config-data") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.301133 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8abdcde-a74d-4d46-9827-720f5202a317" (UID: "c8abdcde-a74d-4d46-9827-720f5202a317"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375047 4752 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375093 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375111 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt2dd\" (UniqueName: \"kubernetes.io/projected/c8abdcde-a74d-4d46-9827-720f5202a317-kube-api-access-tt2dd\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375132 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375146 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.375160 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8abdcde-a74d-4d46-9827-720f5202a317-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.657470 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7ffs" event={"ID":"c8abdcde-a74d-4d46-9827-720f5202a317","Type":"ContainerDied","Data":"c6666194a771567d86b2b8921b5b839ce169fc1008dea15c067061c9fb8157cb"} Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.657888 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6666194a771567d86b2b8921b5b839ce169fc1008dea15c067061c9fb8157cb" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 
12:30:04.657593 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b7ffs" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.746178 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b232e27-0a9b-4d35-9116-60a26c2deb80" path="/var/lib/kubelet/pods/6b232e27-0a9b-4d35-9116-60a26c2deb80/volumes" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.747019 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" path="/var/lib/kubelet/pods/ff71927d-9232-4525-8ac9-4c6c82eb4ed6/volumes" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.751410 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d8447c8fd-t89p7"] Nov 24 12:30:04 crc kubenswrapper[4752]: E1124 12:30:04.751964 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="init" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.751995 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="init" Nov 24 12:30:04 crc kubenswrapper[4752]: E1124 12:30:04.752033 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="dnsmasq-dns" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752046 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="dnsmasq-dns" Nov 24 12:30:04 crc kubenswrapper[4752]: E1124 12:30:04.752067 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" containerName="collect-profiles" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752079 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" containerName="collect-profiles" Nov 24 12:30:04 crc kubenswrapper[4752]: E1124 12:30:04.752099 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8abdcde-a74d-4d46-9827-720f5202a317" containerName="keystone-bootstrap" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752126 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8abdcde-a74d-4d46-9827-720f5202a317" containerName="keystone-bootstrap" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752454 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8abdcde-a74d-4d46-9827-720f5202a317" containerName="keystone-bootstrap" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752510 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff71927d-9232-4525-8ac9-4c6c82eb4ed6" containerName="dnsmasq-dns" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.752531 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" containerName="collect-profiles" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.754394 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.757059 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.760019 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d8447c8fd-t89p7"] Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.760468 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.760931 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.761336 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnw8d" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.882583 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-scripts\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.882686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-credential-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.882871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-config-data\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.883009 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbvcl\" (UniqueName: \"kubernetes.io/projected/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-kube-api-access-hbvcl\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.883055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-fernet-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.883095 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-combined-ca-bundle\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984606 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-config-data\") pod 
\"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984670 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbvcl\" (UniqueName: \"kubernetes.io/projected/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-kube-api-access-hbvcl\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-fernet-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984708 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-combined-ca-bundle\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984794 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-scripts\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.984820 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-credential-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.989470 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-fernet-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.989583 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-scripts\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.989716 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-credential-keys\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 12:30:04.992411 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-combined-ca-bundle\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:04 crc kubenswrapper[4752]: I1124 
12:30:04.992862 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-config-data\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:05 crc kubenswrapper[4752]: I1124 12:30:05.007736 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbvcl\" (UniqueName: \"kubernetes.io/projected/1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8-kube-api-access-hbvcl\") pod \"keystone-5d8447c8fd-t89p7\" (UID: \"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8\") " pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:05 crc kubenswrapper[4752]: I1124 12:30:05.083530 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:05 crc kubenswrapper[4752]: I1124 12:30:05.503478 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d8447c8fd-t89p7"] Nov 24 12:30:05 crc kubenswrapper[4752]: W1124 12:30:05.509328 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fd5ad40_5a39_48db_8a9d_9d200ef0d0a8.slice/crio-c4221dfc8f3b19a9721fd352804bdae5830fab0c59ed5f3affb20fcfc8b52d07 WatchSource:0}: Error finding container c4221dfc8f3b19a9721fd352804bdae5830fab0c59ed5f3affb20fcfc8b52d07: Status 404 returned error can't find the container with id c4221dfc8f3b19a9721fd352804bdae5830fab0c59ed5f3affb20fcfc8b52d07 Nov 24 12:30:05 crc kubenswrapper[4752]: I1124 12:30:05.666098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d8447c8fd-t89p7" event={"ID":"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8","Type":"ContainerStarted","Data":"c4221dfc8f3b19a9721fd352804bdae5830fab0c59ed5f3affb20fcfc8b52d07"} Nov 24 12:30:06 crc kubenswrapper[4752]: I1124 12:30:06.680291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d8447c8fd-t89p7" event={"ID":"1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8","Type":"ContainerStarted","Data":"8c7904e4d89e7af4d3479d4179e61e7188da3527c868ffb540b664086a1bf4ac"} Nov 24 12:30:06 crc kubenswrapper[4752]: I1124 12:30:06.680459 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:06 crc kubenswrapper[4752]: I1124 12:30:06.707363 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d8447c8fd-t89p7" podStartSLOduration=2.707336913 podStartE2EDuration="2.707336913s" podCreationTimestamp="2025-11-24 12:30:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:30:06.697618554 +0000 UTC m=+5012.682438843" watchObservedRunningTime="2025-11-24 12:30:06.707336913 +0000 UTC m=+5012.692157202" Nov 24 12:30:17 crc kubenswrapper[4752]: I1124 12:30:17.728967 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:30:17 crc kubenswrapper[4752]: E1124 12:30:17.730282 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:30:31 crc kubenswrapper[4752]: I1124 12:30:31.727458 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:30:31 crc kubenswrapper[4752]: E1124 12:30:31.728046 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:30:36 crc kubenswrapper[4752]: I1124 12:30:36.629258 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d8447c8fd-t89p7" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.702258 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.704322 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.706736 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9q4w2" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.707338 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.709461 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.718217 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.772539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4rj\" (UniqueName: \"kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.772669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.772726 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.874078 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4rj\" (UniqueName: \"kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc 
kubenswrapper[4752]: I1124 12:30:40.874160 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.874212 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.875001 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.880627 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:40 crc kubenswrapper[4752]: I1124 12:30:40.905323 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4rj\" (UniqueName: \"kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj\") pod \"openstackclient\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " pod="openstack/openstackclient" Nov 24 12:30:41 crc kubenswrapper[4752]: I1124 12:30:41.041580 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 24 12:30:41 crc kubenswrapper[4752]: I1124 12:30:41.501544 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.009148 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"aec6027c-fa97-4f99-adb4-8b4e298a7aa4","Type":"ContainerStarted","Data":"86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20"} Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.009590 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"aec6027c-fa97-4f99-adb4-8b4e298a7aa4","Type":"ContainerStarted","Data":"29500d1b4c742a38eb522f353fb43729586be77f97ba7a611e2983c20d94fdb6"} Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.031855 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.031839222 podStartE2EDuration="2.031839222s" podCreationTimestamp="2025-11-24 12:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:30:42.030433832 +0000 UTC m=+5048.015254181" watchObservedRunningTime="2025-11-24 12:30:42.031839222 +0000 UTC m=+5048.016659501" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.102982 4752 scope.go:117] "RemoveContainer" containerID="407e994f8b0be39e2be3ef7de51b6edb928df16b26b4146717afff104b33eef4" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.125377 4752 scope.go:117] "RemoveContainer" containerID="c9ce396e8ff9b199b3f6b73f8dee36a4e25d6e3eaa4bd14066301b8cfc934d59" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.173982 4752 scope.go:117] "RemoveContainer" containerID="f9e1034cb16de78dc658c4449692af4a495475c73da8f0d44cc63782e23f2016" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.210479 4752 scope.go:117] "RemoveContainer" containerID="2efd9b4d72b138d56faff435a5fb0f29bfc7b9cfb1549fcd47b983b75bf4c663" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.263481 4752 scope.go:117] "RemoveContainer" containerID="effa82a305dce892bcb51a1289dac1f34fa53250a8b21c97bc9398e29a999628" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.281982 4752 scope.go:117] "RemoveContainer" containerID="c2715b13110eb5eb0c124f434cd1db2d3a07e89d32fa818d9c560cf97dfe0ca3" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.303449 4752 scope.go:117] "RemoveContainer" containerID="ec15ec535fcc135a03d6d7bbab34b33fb48c690aa05f30294625b57690e24388" Nov 24 12:30:42 crc kubenswrapper[4752]: I1124 12:30:42.319603 4752 scope.go:117] "RemoveContainer" containerID="0c3394fc59cac84a57a561fc24bad2cb7b841f6dcd44724cee2d9c7a012cd29e" Nov 24 12:30:46 crc kubenswrapper[4752]: I1124 12:30:46.727879 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:30:46 crc kubenswrapper[4752]: E1124 12:30:46.728619 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:00 crc kubenswrapper[4752]: I1124 12:31:00.728720 4752 scope.go:117] 
"RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:31:00 crc kubenswrapper[4752]: E1124 12:31:00.729879 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:13 crc kubenswrapper[4752]: I1124 12:31:13.728115 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:31:13 crc kubenswrapper[4752]: E1124 12:31:13.728925 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:27 crc kubenswrapper[4752]: I1124 12:31:27.729467 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:31:27 crc kubenswrapper[4752]: E1124 12:31:27.730977 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.686471 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.688955 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.698485 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.863289 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.863339 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qwsj\" (UniqueName: \"kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.863372 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.966611 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.967235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qwsj\" (UniqueName: \"kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.967324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.967527 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.968058 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:38 crc kubenswrapper[4752]: I1124 12:31:38.995311 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4qwsj\" (UniqueName: \"kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj\") pod \"redhat-operators-mcbk9\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:39 crc kubenswrapper[4752]: I1124 12:31:39.022531 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:39 crc kubenswrapper[4752]: I1124 12:31:39.539277 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:31:39 crc kubenswrapper[4752]: I1124 12:31:39.629611 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerStarted","Data":"e122f8e6da9ca193bcbdf2852599604e6e984b42d1d4fe7a401ef5cce61a785d"} Nov 24 12:31:40 crc kubenswrapper[4752]: I1124 12:31:40.640107 4752 generic.go:334] "Generic (PLEG): container finished" podID="37860744-8942-4e83-97cb-2aafb7871f2c" containerID="492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0" exitCode=0 Nov 24 12:31:40 crc kubenswrapper[4752]: I1124 12:31:40.640218 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerDied","Data":"492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0"} Nov 24 12:31:40 crc kubenswrapper[4752]: I1124 12:31:40.643031 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:31:40 crc kubenswrapper[4752]: I1124 12:31:40.736132 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:31:40 crc kubenswrapper[4752]: E1124 12:31:40.737366 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:41 crc kubenswrapper[4752]: I1124 12:31:41.668824 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerStarted","Data":"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28"} Nov 24 12:31:42 crc kubenswrapper[4752]: I1124 12:31:42.683792 4752 generic.go:334] "Generic (PLEG): container finished" podID="37860744-8942-4e83-97cb-2aafb7871f2c" containerID="ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28" exitCode=0 Nov 24 12:31:42 crc kubenswrapper[4752]: I1124 12:31:42.683866 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerDied","Data":"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28"} Nov 24 12:31:43 crc kubenswrapper[4752]: I1124 12:31:43.694345 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" 
event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerStarted","Data":"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a"} Nov 24 12:31:43 crc kubenswrapper[4752]: I1124 12:31:43.714268 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mcbk9" podStartSLOduration=3.132523937 podStartE2EDuration="5.71424579s" podCreationTimestamp="2025-11-24 12:31:38 +0000 UTC" firstStartedPulling="2025-11-24 12:31:40.642791497 +0000 UTC m=+5106.627611786" lastFinishedPulling="2025-11-24 12:31:43.22451334 +0000 UTC m=+5109.209333639" observedRunningTime="2025-11-24 12:31:43.713810417 +0000 UTC m=+5109.698630706" watchObservedRunningTime="2025-11-24 12:31:43.71424579 +0000 UTC m=+5109.699066089" Nov 24 12:31:49 crc kubenswrapper[4752]: I1124 12:31:49.032432 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:49 crc kubenswrapper[4752]: I1124 12:31:49.033190 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:50 crc kubenswrapper[4752]: I1124 12:31:50.080876 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mcbk9" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="registry-server" probeResult="failure" output=< Nov 24 12:31:50 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:31:50 crc kubenswrapper[4752]: > Nov 24 12:31:51 crc kubenswrapper[4752]: I1124 12:31:51.728599 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:31:51 crc kubenswrapper[4752]: E1124 12:31:51.729384 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:31:59 crc kubenswrapper[4752]: I1124 12:31:59.080364 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:59 crc kubenswrapper[4752]: I1124 12:31:59.158889 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:31:59 crc kubenswrapper[4752]: I1124 12:31:59.325540 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:32:00 crc kubenswrapper[4752]: I1124 12:32:00.859370 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mcbk9" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="registry-server" containerID="cri-o://ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a" gracePeriod=2 Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.339422 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.408590 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities\") pod \"37860744-8942-4e83-97cb-2aafb7871f2c\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.408771 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content\") pod \"37860744-8942-4e83-97cb-2aafb7871f2c\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.408859 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qwsj\" (UniqueName: \"kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj\") pod \"37860744-8942-4e83-97cb-2aafb7871f2c\" (UID: \"37860744-8942-4e83-97cb-2aafb7871f2c\") " Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.410546 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities" (OuterVolumeSpecName: "utilities") pod "37860744-8942-4e83-97cb-2aafb7871f2c" (UID: "37860744-8942-4e83-97cb-2aafb7871f2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.420725 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj" (OuterVolumeSpecName: "kube-api-access-4qwsj") pod "37860744-8942-4e83-97cb-2aafb7871f2c" (UID: "37860744-8942-4e83-97cb-2aafb7871f2c"). InnerVolumeSpecName "kube-api-access-4qwsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.508080 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37860744-8942-4e83-97cb-2aafb7871f2c" (UID: "37860744-8942-4e83-97cb-2aafb7871f2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.510705 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qwsj\" (UniqueName: \"kubernetes.io/projected/37860744-8942-4e83-97cb-2aafb7871f2c-kube-api-access-4qwsj\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.510736 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.511166 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37860744-8942-4e83-97cb-2aafb7871f2c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.872166 4752 generic.go:334] "Generic (PLEG): container finished" podID="37860744-8942-4e83-97cb-2aafb7871f2c" containerID="ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a" exitCode=0 Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.872219 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerDied","Data":"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a"} Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.872246 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcbk9" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.872301 4752 scope.go:117] "RemoveContainer" containerID="ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.872257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcbk9" event={"ID":"37860744-8942-4e83-97cb-2aafb7871f2c","Type":"ContainerDied","Data":"e122f8e6da9ca193bcbdf2852599604e6e984b42d1d4fe7a401ef5cce61a785d"} Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.912183 4752 scope.go:117] "RemoveContainer" containerID="ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.912662 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.921947 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mcbk9"] Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.945954 4752 scope.go:117] "RemoveContainer" containerID="492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.993049 4752 scope.go:117] "RemoveContainer" containerID="ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a" Nov 24 12:32:01 crc kubenswrapper[4752]: E1124 12:32:01.993490 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a\": container with ID starting with ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a not found: ID does not exist" containerID="ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.993550 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a"} err="failed to get container status \"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a\": rpc error: code = NotFound desc = could not find container \"ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a\": container with ID starting with ec769fc7d43801ad99ddeeafe75fa806c8c24bd37bc1bfe2c9aa6c9e6ab37a7a not found: ID does not exist" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.993573 4752 scope.go:117] "RemoveContainer" containerID="ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28" Nov 24 12:32:01 crc kubenswrapper[4752]: E1124 12:32:01.993868 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28\": container with ID starting with ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28 not found: ID does not exist" containerID="ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.993919 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28"} err="failed to get container status \"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28\": rpc error: code = NotFound desc = could not find container \"ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28\": container with ID starting with ac7cb856938cbd6d3bdc6cc7009d9e25d5d350b6e4a8feb0fa0da88a367eed28 not found: ID does not exist" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.993936 4752 scope.go:117] "RemoveContainer" containerID="492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0" Nov 24 12:32:01 crc kubenswrapper[4752]: E1124 12:32:01.994240 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0\": container with ID starting with 492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0 not found: ID does not exist" containerID="492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0" Nov 24 12:32:01 crc kubenswrapper[4752]: I1124 12:32:01.994271 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0"} err="failed to get container status \"492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0\": rpc error: code = NotFound desc = could not find container \"492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0\": container with ID starting with 492684ce571057f012601a4aed4bc76a3bdcd8b50d32c9bbeda3e21a6d95d6e0 not found: ID does not exist" Nov 24 12:32:02 crc kubenswrapper[4752]: I1124 12:32:02.738200 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" path="/var/lib/kubelet/pods/37860744-8942-4e83-97cb-2aafb7871f2c/volumes" Nov 24 12:32:05 crc kubenswrapper[4752]: I1124 12:32:05.728013 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:32:05 crc kubenswrapper[4752]: E1124 12:32:05.728609 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.556563 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nh9fj"] Nov 24 12:32:18 crc kubenswrapper[4752]: E1124 12:32:18.557570 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="extract-utilities" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.557587 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="extract-utilities" Nov 24 12:32:18 crc kubenswrapper[4752]: E1124 12:32:18.557605 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="extract-content" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.557612 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="extract-content" Nov 24 12:32:18 crc kubenswrapper[4752]: E1124 12:32:18.557636 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="registry-server" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.557644 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="registry-server" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.557840 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="37860744-8942-4e83-97cb-2aafb7871f2c" containerName="registry-server" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.558516 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.565382 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nh9fj"] Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.605061 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.605145 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgnkr\" (UniqueName: \"kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.656796 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1e06-account-create-vcjrj"] Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.657861 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.659737 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.667991 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1e06-account-create-vcjrj"] Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.707513 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.707587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87x2\" (UniqueName: \"kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.707967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.708043 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgnkr\" (UniqueName: \"kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.708870 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.727296 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgnkr\" (UniqueName: \"kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr\") pod \"barbican-db-create-nh9fj\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.727924 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:32:18 crc kubenswrapper[4752]: E1124 12:32:18.728178 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.809100 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.809321 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87x2\" (UniqueName: \"kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.809925 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.827236 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87x2\" (UniqueName: \"kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2\") pod \"barbican-1e06-account-create-vcjrj\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.911943 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:18 crc kubenswrapper[4752]: I1124 12:32:18.975957 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:19 crc kubenswrapper[4752]: I1124 12:32:19.372292 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nh9fj"] Nov 24 12:32:19 crc kubenswrapper[4752]: I1124 12:32:19.483781 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1e06-account-create-vcjrj"] Nov 24 12:32:19 crc kubenswrapper[4752]: W1124 12:32:19.508813 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae3d459b_1cb1_4d4a_a781_53f77cb195e4.slice/crio-b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558 WatchSource:0}: Error finding container b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558: Status 404 returned error can't find the container with id b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558 Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.032365 4752 generic.go:334] "Generic (PLEG): container finished" podID="84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" containerID="b665efa9abf8dd36c2480e8a4fef3520a378fc471f87a1d4e667f7f0feb60a5f" exitCode=0 Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.032452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nh9fj" event={"ID":"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e","Type":"ContainerDied","Data":"b665efa9abf8dd36c2480e8a4fef3520a378fc471f87a1d4e667f7f0feb60a5f"} Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.032819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nh9fj" event={"ID":"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e","Type":"ContainerStarted","Data":"a6032c003bc5e339d590d403ee4e7ca99ed169a2a8b6183c7b121774eebe074a"} Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.034999 4752 generic.go:334] "Generic (PLEG): container finished" podID="ae3d459b-1cb1-4d4a-a781-53f77cb195e4" containerID="0166f69cf3af1e5c15aaa08a472e637dea5361a73d2eaaef55a99edd701b8560" exitCode=0 Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.035027 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e06-account-create-vcjrj" event={"ID":"ae3d459b-1cb1-4d4a-a781-53f77cb195e4","Type":"ContainerDied","Data":"0166f69cf3af1e5c15aaa08a472e637dea5361a73d2eaaef55a99edd701b8560"} Nov 24 12:32:20 crc kubenswrapper[4752]: I1124 12:32:20.035050 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e06-account-create-vcjrj" event={"ID":"ae3d459b-1cb1-4d4a-a781-53f77cb195e4","Type":"ContainerStarted","Data":"b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558"} Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.377999 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.385224 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.556506 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts\") pod \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.556919 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p87x2\" (UniqueName: \"kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2\") pod \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\" (UID: \"ae3d459b-1cb1-4d4a-a781-53f77cb195e4\") " Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.556954 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgnkr\" (UniqueName: \"kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr\") pod \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.557060 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts\") pod \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\" (UID: \"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e\") " Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.557487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae3d459b-1cb1-4d4a-a781-53f77cb195e4" (UID: "ae3d459b-1cb1-4d4a-a781-53f77cb195e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.557662 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" (UID: "84e5f7b0-c4d2-4672-8b81-87f84d56ac1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.563357 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr" (OuterVolumeSpecName: "kube-api-access-xgnkr") pod "84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" (UID: "84e5f7b0-c4d2-4672-8b81-87f84d56ac1e"). InnerVolumeSpecName "kube-api-access-xgnkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.565703 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2" (OuterVolumeSpecName: "kube-api-access-p87x2") pod "ae3d459b-1cb1-4d4a-a781-53f77cb195e4" (UID: "ae3d459b-1cb1-4d4a-a781-53f77cb195e4"). InnerVolumeSpecName "kube-api-access-p87x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.658794 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.658832 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p87x2\" (UniqueName: \"kubernetes.io/projected/ae3d459b-1cb1-4d4a-a781-53f77cb195e4-kube-api-access-p87x2\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.658843 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgnkr\" (UniqueName: \"kubernetes.io/projected/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-kube-api-access-xgnkr\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:21 crc kubenswrapper[4752]: I1124 12:32:21.658852 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.057102 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nh9fj" event={"ID":"84e5f7b0-c4d2-4672-8b81-87f84d56ac1e","Type":"ContainerDied","Data":"a6032c003bc5e339d590d403ee4e7ca99ed169a2a8b6183c7b121774eebe074a"} Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.057142 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nh9fj" Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.057164 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6032c003bc5e339d590d403ee4e7ca99ed169a2a8b6183c7b121774eebe074a" Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.069543 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e06-account-create-vcjrj" event={"ID":"ae3d459b-1cb1-4d4a-a781-53f77cb195e4","Type":"ContainerDied","Data":"b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558"} Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.069603 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b169d480572e5ee2e2157dcacbf541f0aa3ccfb96cc284e05e51a4dc120c9558" Nov 24 12:32:22 crc kubenswrapper[4752]: I1124 12:32:22.069617 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1e06-account-create-vcjrj" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.925157 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9jxh8"] Nov 24 12:32:23 crc kubenswrapper[4752]: E1124 12:32:23.925568 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3d459b-1cb1-4d4a-a781-53f77cb195e4" containerName="mariadb-account-create" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.925587 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3d459b-1cb1-4d4a-a781-53f77cb195e4" containerName="mariadb-account-create" Nov 24 12:32:23 crc kubenswrapper[4752]: E1124 12:32:23.925615 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" containerName="mariadb-database-create" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.925623 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" containerName="mariadb-database-create" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.925816 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" containerName="mariadb-database-create" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.925843 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3d459b-1cb1-4d4a-a781-53f77cb195e4" containerName="mariadb-account-create" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.926492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.928920 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.929062 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f7rws" Nov 24 12:32:23 crc kubenswrapper[4752]: I1124 12:32:23.946546 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jxh8"] Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.101762 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.101841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlhsx\" (UniqueName: \"kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.101992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.203668 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.203946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlhsx\" (UniqueName: \"kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.203996 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.209798 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.220291 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.223851 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlhsx\" (UniqueName: \"kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx\") pod \"barbican-db-sync-9jxh8\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.243848 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:24 crc kubenswrapper[4752]: I1124 12:32:24.682804 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jxh8"] Nov 24 12:32:24 crc kubenswrapper[4752]: W1124 12:32:24.687937 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bed6539_d927_4527_95f0_1643bd3b0cc7.slice/crio-4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784 WatchSource:0}: Error finding container 4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784: Status 404 returned error can't find the container with id 4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784 Nov 24 12:32:25 crc kubenswrapper[4752]: I1124 12:32:25.094062 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jxh8" event={"ID":"7bed6539-d927-4527-95f0-1643bd3b0cc7","Type":"ContainerStarted","Data":"926aab143f59578ae4b83e8719712c16bcd0b52c98b8f2af47f3ce97fd7afda7"} Nov 24 12:32:25 crc kubenswrapper[4752]: I1124 12:32:25.095460 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jxh8" event={"ID":"7bed6539-d927-4527-95f0-1643bd3b0cc7","Type":"ContainerStarted","Data":"4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784"} Nov 24 12:32:25 crc kubenswrapper[4752]: I1124 12:32:25.117229 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9jxh8" podStartSLOduration=2.117211978 podStartE2EDuration="2.117211978s" podCreationTimestamp="2025-11-24 12:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:25.114407408 +0000 UTC m=+5151.099227697" watchObservedRunningTime="2025-11-24 12:32:25.117211978 +0000 UTC m=+5151.102032267" Nov 24 12:32:27 crc kubenswrapper[4752]: I1124 12:32:27.119463 4752 generic.go:334] "Generic (PLEG): container finished" podID="7bed6539-d927-4527-95f0-1643bd3b0cc7" containerID="926aab143f59578ae4b83e8719712c16bcd0b52c98b8f2af47f3ce97fd7afda7" exitCode=0 Nov 24 12:32:27 crc kubenswrapper[4752]: I1124 12:32:27.119558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jxh8" event={"ID":"7bed6539-d927-4527-95f0-1643bd3b0cc7","Type":"ContainerDied","Data":"926aab143f59578ae4b83e8719712c16bcd0b52c98b8f2af47f3ce97fd7afda7"} Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.430362 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.584929 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data\") pod \"7bed6539-d927-4527-95f0-1643bd3b0cc7\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.585029 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle\") pod \"7bed6539-d927-4527-95f0-1643bd3b0cc7\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.585110 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlhsx\" (UniqueName: \"kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx\") pod \"7bed6539-d927-4527-95f0-1643bd3b0cc7\" (UID: \"7bed6539-d927-4527-95f0-1643bd3b0cc7\") " Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.590356 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7bed6539-d927-4527-95f0-1643bd3b0cc7" (UID: "7bed6539-d927-4527-95f0-1643bd3b0cc7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.590765 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx" (OuterVolumeSpecName: "kube-api-access-xlhsx") pod "7bed6539-d927-4527-95f0-1643bd3b0cc7" (UID: "7bed6539-d927-4527-95f0-1643bd3b0cc7"). InnerVolumeSpecName "kube-api-access-xlhsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.608173 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bed6539-d927-4527-95f0-1643bd3b0cc7" (UID: "7bed6539-d927-4527-95f0-1643bd3b0cc7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.686838 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlhsx\" (UniqueName: \"kubernetes.io/projected/7bed6539-d927-4527-95f0-1643bd3b0cc7-kube-api-access-xlhsx\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.686873 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:28 crc kubenswrapper[4752]: I1124 12:32:28.686884 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bed6539-d927-4527-95f0-1643bd3b0cc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.140052 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jxh8" event={"ID":"7bed6539-d927-4527-95f0-1643bd3b0cc7","Type":"ContainerDied","Data":"4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784"} Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.140349 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1841a1349217d5ff5e2ce2b0daef618c6fc9bbe65eb72d8f939c6d73724784" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.140194 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jxh8" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.374020 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf"] Nov 24 12:32:29 crc kubenswrapper[4752]: E1124 12:32:29.374333 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bed6539-d927-4527-95f0-1643bd3b0cc7" containerName="barbican-db-sync" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.374345 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bed6539-d927-4527-95f0-1643bd3b0cc7" containerName="barbican-db-sync" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.374532 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bed6539-d927-4527-95f0-1643bd3b0cc7" containerName="barbican-db-sync" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.375381 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.377548 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.380926 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.381200 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f7rws" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.388823 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65559fc695-tzc6z"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.390310 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.394403 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.413573 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.433782 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65559fc695-tzc6z"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501545 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data-custom\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501616 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cbe655-b57b-4d57-97e5-7e1f18c47167-logs\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501638 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv4rx\" (UniqueName: \"kubernetes.io/projected/ab3ff015-f006-4c8b-8cde-6191a3ddf473-kube-api-access-wv4rx\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3ff015-f006-4c8b-8cde-6191a3ddf473-logs\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501691 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501718 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data-custom\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501737 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " 
pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501877 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-combined-ca-bundle\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501925 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b6n\" (UniqueName: \"kubernetes.io/projected/a4cbe655-b57b-4d57-97e5-7e1f18c47167-kube-api-access-m7b6n\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.501968 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.518291 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.519770 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.532754 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.604242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.604682 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data-custom\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.604858 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cbe655-b57b-4d57-97e5-7e1f18c47167-logs\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.604988 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv4rx\" (UniqueName: \"kubernetes.io/projected/ab3ff015-f006-4c8b-8cde-6191a3ddf473-kube-api-access-wv4rx\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605101 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3ff015-f006-4c8b-8cde-6191a3ddf473-logs\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605182 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605300 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data-custom\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605467 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-combined-ca-bundle\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.605594 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b6n\" (UniqueName: \"kubernetes.io/projected/a4cbe655-b57b-4d57-97e5-7e1f18c47167-kube-api-access-m7b6n\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.608662 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cbe655-b57b-4d57-97e5-7e1f18c47167-logs\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.609242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab3ff015-f006-4c8b-8cde-6191a3ddf473-logs\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.613221 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-combined-ca-bundle\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc 
kubenswrapper[4752]: I1124 12:32:29.617829 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-combined-ca-bundle\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.618387 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.619652 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.620153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4cbe655-b57b-4d57-97e5-7e1f18c47167-config-data-custom\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.621394 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab3ff015-f006-4c8b-8cde-6191a3ddf473-config-data-custom\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.630081 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv4rx\" (UniqueName: \"kubernetes.io/projected/ab3ff015-f006-4c8b-8cde-6191a3ddf473-kube-api-access-wv4rx\") pod \"barbican-keystone-listener-6bbdb9cc7d-h6rsf\" (UID: \"ab3ff015-f006-4c8b-8cde-6191a3ddf473\") " pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.644667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b6n\" (UniqueName: \"kubernetes.io/projected/a4cbe655-b57b-4d57-97e5-7e1f18c47167-kube-api-access-m7b6n\") pod \"barbican-worker-65559fc695-tzc6z\" (UID: \"a4cbe655-b57b-4d57-97e5-7e1f18c47167\") " pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.672815 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-547c99f9b6-9fzgx"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.674226 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.680253 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.709118 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.709173 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l8lt\" (UniqueName: \"kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.709218 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.709257 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.709322 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.712219 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547c99f9b6-9fzgx"] Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.739393 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.748609 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65559fc695-tzc6z" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811614 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811689 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d182b26e-e887-47d5-a834-4584f6110213-logs\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811764 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811808 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-combined-ca-bundle\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811834 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data-custom\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l8lt\" (UniqueName: \"kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbknz\" (UniqueName: \"kubernetes.io/projected/d182b26e-e887-47d5-a834-4584f6110213-kube-api-access-bbknz\") pod 
\"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.811987 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.813045 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.814399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.814399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.816713 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.831448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l8lt\" (UniqueName: \"kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt\") pod \"dnsmasq-dns-77f8c95469-vlmpv\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.897866 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.912939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d182b26e-e887-47d5-a834-4584f6110213-logs\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.913287 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-combined-ca-bundle\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.913310 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data-custom\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.913329 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.913359 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbknz\" (UniqueName: \"kubernetes.io/projected/d182b26e-e887-47d5-a834-4584f6110213-kube-api-access-bbknz\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.913627 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d182b26e-e887-47d5-a834-4584f6110213-logs\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.923030 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.928771 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-config-data-custom\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc kubenswrapper[4752]: I1124 12:32:29.930918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d182b26e-e887-47d5-a834-4584f6110213-combined-ca-bundle\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:29 crc 
kubenswrapper[4752]: I1124 12:32:29.932660 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbknz\" (UniqueName: \"kubernetes.io/projected/d182b26e-e887-47d5-a834-4584f6110213-kube-api-access-bbknz\") pod \"barbican-api-547c99f9b6-9fzgx\" (UID: \"d182b26e-e887-47d5-a834-4584f6110213\") " pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:30 crc kubenswrapper[4752]: I1124 12:32:30.053600 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:30 crc kubenswrapper[4752]: I1124 12:32:30.333914 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf"] Nov 24 12:32:30 crc kubenswrapper[4752]: I1124 12:32:30.422889 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:32:30 crc kubenswrapper[4752]: I1124 12:32:30.437833 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65559fc695-tzc6z"] Nov 24 12:32:30 crc kubenswrapper[4752]: I1124 12:32:30.597034 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-547c99f9b6-9fzgx"] Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.189103 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65559fc695-tzc6z" event={"ID":"a4cbe655-b57b-4d57-97e5-7e1f18c47167","Type":"ContainerStarted","Data":"6f1ed3a34df0be2c2dcd4c04d2152130a79d6128792b1ca58883098fb59a77eb"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.189413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65559fc695-tzc6z" event={"ID":"a4cbe655-b57b-4d57-97e5-7e1f18c47167","Type":"ContainerStarted","Data":"e38f4e24adbbfffc7cbd81b43ca70365975900cad61cbd1b6a84b854484cb4d4"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.189426 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65559fc695-tzc6z" event={"ID":"a4cbe655-b57b-4d57-97e5-7e1f18c47167","Type":"ContainerStarted","Data":"e1f5152f119bfc6f01c7b6299f51fe5ab62ca809252c6aa5b2865cbf0d69f691"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.193044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547c99f9b6-9fzgx" event={"ID":"d182b26e-e887-47d5-a834-4584f6110213","Type":"ContainerStarted","Data":"dbb8db57832511033c309ede0f968fc963c3623eb756caafb01e234972b12315"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.193083 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547c99f9b6-9fzgx" event={"ID":"d182b26e-e887-47d5-a834-4584f6110213","Type":"ContainerStarted","Data":"6c73c51284eee82b69f408e227e65fb7b82f9e5aff3e6ca2c5f65538194f868b"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.193097 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-547c99f9b6-9fzgx" event={"ID":"d182b26e-e887-47d5-a834-4584f6110213","Type":"ContainerStarted","Data":"d134b226155367de47701971fb79233383ff77a22e1747da12778cfee3327fa2"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.193642 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.193684 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 
12:32:31.195795 4752 generic.go:334] "Generic (PLEG): container finished" podID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerID="d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f" exitCode=0 Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.195856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" event={"ID":"0011a25f-8be1-41a4-8ab3-04d9e30f83cc","Type":"ContainerDied","Data":"d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.195904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" event={"ID":"0011a25f-8be1-41a4-8ab3-04d9e30f83cc","Type":"ContainerStarted","Data":"36443f1297c82d50f2087d9b4ece2387dc3d6bd76c4453af79022843356f011f"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.198448 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" event={"ID":"ab3ff015-f006-4c8b-8cde-6191a3ddf473","Type":"ContainerStarted","Data":"1dc1ff2a15a976804c0d4d565a184b6f4a0c2f67d6e89341c3f8c66a64ce17f5"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.198516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" event={"ID":"ab3ff015-f006-4c8b-8cde-6191a3ddf473","Type":"ContainerStarted","Data":"579c06f62306e3b1eb61e1cf95f3bec1e9b5fa8ddecfb6547a67c0338729c375"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.198538 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" event={"ID":"ab3ff015-f006-4c8b-8cde-6191a3ddf473","Type":"ContainerStarted","Data":"bff1dcf25251c1cec2059f210991ba9999e4c2ffb00b0659771add5510879b10"} Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.218453 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65559fc695-tzc6z" podStartSLOduration=2.21842957 podStartE2EDuration="2.21842957s" podCreationTimestamp="2025-11-24 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:31.211628564 +0000 UTC m=+5157.196448863" watchObservedRunningTime="2025-11-24 12:32:31.21842957 +0000 UTC m=+5157.203249859" Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.261558 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-547c99f9b6-9fzgx" podStartSLOduration=2.261543016 podStartE2EDuration="2.261543016s" podCreationTimestamp="2025-11-24 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:31.260852027 +0000 UTC m=+5157.245672336" watchObservedRunningTime="2025-11-24 12:32:31.261543016 +0000 UTC m=+5157.246363305" Nov 24 12:32:31 crc kubenswrapper[4752]: I1124 12:32:31.294963 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6bbdb9cc7d-h6rsf" podStartSLOduration=2.294941475 podStartE2EDuration="2.294941475s" podCreationTimestamp="2025-11-24 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:31.29128771 +0000 UTC m=+5157.276108009" watchObservedRunningTime="2025-11-24 12:32:31.294941475 +0000 UTC m=+5157.279761764" 
Nov 24 12:32:32 crc kubenswrapper[4752]: I1124 12:32:32.209011 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" event={"ID":"0011a25f-8be1-41a4-8ab3-04d9e30f83cc","Type":"ContainerStarted","Data":"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a"} Nov 24 12:32:32 crc kubenswrapper[4752]: I1124 12:32:32.210835 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:32 crc kubenswrapper[4752]: I1124 12:32:32.236765 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" podStartSLOduration=3.236724322 podStartE2EDuration="3.236724322s" podCreationTimestamp="2025-11-24 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:32.231613806 +0000 UTC m=+5158.216434095" watchObservedRunningTime="2025-11-24 12:32:32.236724322 +0000 UTC m=+5158.221544611" Nov 24 12:32:32 crc kubenswrapper[4752]: I1124 12:32:32.728347 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:32:32 crc kubenswrapper[4752]: E1124 12:32:32.729045 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:32:36 crc kubenswrapper[4752]: I1124 12:32:36.591577 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:37 crc kubenswrapper[4752]: I1124 12:32:37.939625 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-547c99f9b6-9fzgx" Nov 24 12:32:39 crc kubenswrapper[4752]: I1124 12:32:39.900206 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:32:39 crc kubenswrapper[4752]: I1124 12:32:39.978281 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:32:39 crc kubenswrapper[4752]: I1124 12:32:39.978526 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="dnsmasq-dns" containerID="cri-o://4bc2a4252f2a315eda67ff50c5f38e7e0202f8b75440813251bd74851c636377" gracePeriod=10 Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.288651 4752 generic.go:334] "Generic (PLEG): container finished" podID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerID="4bc2a4252f2a315eda67ff50c5f38e7e0202f8b75440813251bd74851c636377" exitCode=0 Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.288689 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" event={"ID":"57dda24e-e364-43f0-86a0-b7c15e511d21","Type":"ContainerDied","Data":"4bc2a4252f2a315eda67ff50c5f38e7e0202f8b75440813251bd74851c636377"} Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.484005 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.530259 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config\") pod \"57dda24e-e364-43f0-86a0-b7c15e511d21\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.530348 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hd58\" (UniqueName: \"kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58\") pod \"57dda24e-e364-43f0-86a0-b7c15e511d21\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.530406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb\") pod \"57dda24e-e364-43f0-86a0-b7c15e511d21\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.550507 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58" (OuterVolumeSpecName: "kube-api-access-7hd58") pod "57dda24e-e364-43f0-86a0-b7c15e511d21" (UID: "57dda24e-e364-43f0-86a0-b7c15e511d21"). InnerVolumeSpecName "kube-api-access-7hd58". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.590560 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57dda24e-e364-43f0-86a0-b7c15e511d21" (UID: "57dda24e-e364-43f0-86a0-b7c15e511d21"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.595966 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config" (OuterVolumeSpecName: "config") pod "57dda24e-e364-43f0-86a0-b7c15e511d21" (UID: "57dda24e-e364-43f0-86a0-b7c15e511d21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.631474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc\") pod \"57dda24e-e364-43f0-86a0-b7c15e511d21\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.631952 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb\") pod \"57dda24e-e364-43f0-86a0-b7c15e511d21\" (UID: \"57dda24e-e364-43f0-86a0-b7c15e511d21\") " Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.632492 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.632613 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hd58\" (UniqueName: \"kubernetes.io/projected/57dda24e-e364-43f0-86a0-b7c15e511d21-kube-api-access-7hd58\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.632689 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.673453 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57dda24e-e364-43f0-86a0-b7c15e511d21" (UID: "57dda24e-e364-43f0-86a0-b7c15e511d21"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.681722 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57dda24e-e364-43f0-86a0-b7c15e511d21" (UID: "57dda24e-e364-43f0-86a0-b7c15e511d21"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.736198 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:40 crc kubenswrapper[4752]: I1124 12:32:40.736231 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57dda24e-e364-43f0-86a0-b7c15e511d21-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.300285 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" event={"ID":"57dda24e-e364-43f0-86a0-b7c15e511d21","Type":"ContainerDied","Data":"df42fdd2d0f595a9678f2dd3540f6d696aa2034396964902d901b9c1473069e7"} Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.300352 4752 scope.go:117] "RemoveContainer" containerID="4bc2a4252f2a315eda67ff50c5f38e7e0202f8b75440813251bd74851c636377" Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.300371 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c78d97f89-ljr8f" Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.331827 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.339353 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c78d97f89-ljr8f"] Nov 24 12:32:41 crc kubenswrapper[4752]: I1124 12:32:41.339476 4752 scope.go:117] "RemoveContainer" containerID="86241e7383c395847541aff37734208f4111b3efe0c2f9c24c12229d4f17f538" Nov 24 12:32:42 crc kubenswrapper[4752]: I1124 12:32:42.745781 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" path="/var/lib/kubelet/pods/57dda24e-e364-43f0-86a0-b7c15e511d21/volumes" Nov 24 12:32:46 crc kubenswrapper[4752]: I1124 12:32:46.728275 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:32:46 crc kubenswrapper[4752]: E1124 12:32:46.729180 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.464639 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7v95l"] Nov 24 12:32:52 crc kubenswrapper[4752]: E1124 12:32:52.466864 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="init" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.467046 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="init" Nov 24 12:32:52 crc kubenswrapper[4752]: E1124 12:32:52.467138 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="dnsmasq-dns" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.467195 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="dnsmasq-dns" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.467410 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="57dda24e-e364-43f0-86a0-b7c15e511d21" containerName="dnsmasq-dns" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.468197 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.474269 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7v95l"] Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.555936 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-05d1-account-create-49gwn"] Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.557217 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.560700 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.567587 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-05d1-account-create-49gwn"] Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.567947 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts\") pod \"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.568127 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.568325 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjchk\" (UniqueName: \"kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk\") pod \"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.568470 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6b8\" (UniqueName: \"kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.670072 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.670210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjchk\" (UniqueName: \"kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk\") pod \"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.670263 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6b8\" (UniqueName: \"kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.670311 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts\") pod 
\"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.671029 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.671190 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts\") pod \"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.692472 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjchk\" (UniqueName: \"kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk\") pod \"neutron-db-create-7v95l\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.692551 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6b8\" (UniqueName: \"kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8\") pod \"neutron-05d1-account-create-49gwn\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.978590 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:52 crc kubenswrapper[4752]: I1124 12:32:52.978637 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:53 crc kubenswrapper[4752]: I1124 12:32:53.462787 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-05d1-account-create-49gwn"] Nov 24 12:32:53 crc kubenswrapper[4752]: W1124 12:32:53.468633 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d0d60e_824d_4b68_aadc_0803ccd1130e.slice/crio-937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb WatchSource:0}: Error finding container 937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb: Status 404 returned error can't find the container with id 937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb Nov 24 12:32:53 crc kubenswrapper[4752]: I1124 12:32:53.512571 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7v95l"] Nov 24 12:32:53 crc kubenswrapper[4752]: W1124 12:32:53.524245 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cbf0bbd_6b1f_48e1_8ec4_8779d4c9d060.slice/crio-ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063 WatchSource:0}: Error finding container ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063: Status 404 returned error can't find the container with id ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063 Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.436503 4752 generic.go:334] "Generic (PLEG): container finished" podID="0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" containerID="62fccc183a19821ab053524c28a352d904efaeac505327c8c34b214b0afa452f" exitCode=0 Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.436624 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7v95l" event={"ID":"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060","Type":"ContainerDied","Data":"62fccc183a19821ab053524c28a352d904efaeac505327c8c34b214b0afa452f"} Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.436677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7v95l" event={"ID":"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060","Type":"ContainerStarted","Data":"ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063"} Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.439595 4752 generic.go:334] "Generic (PLEG): container finished" podID="b7d0d60e-824d-4b68-aadc-0803ccd1130e" containerID="ca89dccb86162f3dd03a28a60e76ad881866be87b888235d3fa3a5596c60fe94" exitCode=0 Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.439705 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-05d1-account-create-49gwn" event={"ID":"b7d0d60e-824d-4b68-aadc-0803ccd1130e","Type":"ContainerDied","Data":"ca89dccb86162f3dd03a28a60e76ad881866be87b888235d3fa3a5596c60fe94"} Nov 24 12:32:54 crc kubenswrapper[4752]: I1124 12:32:54.439812 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-05d1-account-create-49gwn" event={"ID":"b7d0d60e-824d-4b68-aadc-0803ccd1130e","Type":"ContainerStarted","Data":"937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb"} Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.874648 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.889859 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.936455 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts\") pod \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.936595 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts\") pod \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.936677 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjchk\" (UniqueName: \"kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk\") pod \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\" (UID: \"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060\") " Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.937086 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6b8\" (UniqueName: \"kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8\") pod \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\" (UID: \"b7d0d60e-824d-4b68-aadc-0803ccd1130e\") " Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.937315 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" (UID: "0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.937382 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7d0d60e-824d-4b68-aadc-0803ccd1130e" (UID: "b7d0d60e-824d-4b68-aadc-0803ccd1130e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.937738 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d0d60e-824d-4b68-aadc-0803ccd1130e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.937789 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.944526 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8" (OuterVolumeSpecName: "kube-api-access-pd6b8") pod "b7d0d60e-824d-4b68-aadc-0803ccd1130e" (UID: "b7d0d60e-824d-4b68-aadc-0803ccd1130e"). 
InnerVolumeSpecName "kube-api-access-pd6b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:55 crc kubenswrapper[4752]: I1124 12:32:55.954168 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk" (OuterVolumeSpecName: "kube-api-access-kjchk") pod "0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" (UID: "0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060"). InnerVolumeSpecName "kube-api-access-kjchk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.039454 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6b8\" (UniqueName: \"kubernetes.io/projected/b7d0d60e-824d-4b68-aadc-0803ccd1130e-kube-api-access-pd6b8\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.039506 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjchk\" (UniqueName: \"kubernetes.io/projected/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060-kube-api-access-kjchk\") on node \"crc\" DevicePath \"\"" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.463290 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-05d1-account-create-49gwn" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.463341 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-05d1-account-create-49gwn" event={"ID":"b7d0d60e-824d-4b68-aadc-0803ccd1130e","Type":"ContainerDied","Data":"937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb"} Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.463398 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937329a370727cff5a767c2737d6606c3d5122c07b176067566fc2b5968142eb" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.465075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7v95l" event={"ID":"0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060","Type":"ContainerDied","Data":"ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063"} Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.465118 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee31c83c7d6953d1a8b70a01f2255688020b1005659b06bcdcc7fdae556e3063" Nov 24 12:32:56 crc kubenswrapper[4752]: I1124 12:32:56.465142 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7v95l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.739959 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4cs8l"] Nov 24 12:32:57 crc kubenswrapper[4752]: E1124 12:32:57.740820 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d0d60e-824d-4b68-aadc-0803ccd1130e" containerName="mariadb-account-create" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.740841 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d0d60e-824d-4b68-aadc-0803ccd1130e" containerName="mariadb-account-create" Nov 24 12:32:57 crc kubenswrapper[4752]: E1124 12:32:57.740853 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" containerName="mariadb-database-create" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.740861 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" containerName="mariadb-database-create" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.741055 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" containerName="mariadb-database-create" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.741078 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d0d60e-824d-4b68-aadc-0803ccd1130e" containerName="mariadb-account-create" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.741799 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.743461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qxls4" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.743802 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.744462 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.748079 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4cs8l"] Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.768245 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq2gs\" (UniqueName: \"kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.768331 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.768412 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.869347 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq2gs\" (UniqueName: \"kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.869398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.869446 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.875341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.875507 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:57 crc kubenswrapper[4752]: I1124 12:32:57.890989 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq2gs\" (UniqueName: \"kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs\") pod \"neutron-db-sync-4cs8l\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:58 crc kubenswrapper[4752]: I1124 12:32:58.065917 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:32:58 crc kubenswrapper[4752]: I1124 12:32:58.555744 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4cs8l"] Nov 24 12:32:59 crc kubenswrapper[4752]: I1124 12:32:59.500773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4cs8l" event={"ID":"72555f28-0bcc-4ada-bd18-48e12c193a4a","Type":"ContainerStarted","Data":"09b52e83937056b67ffaaf5fd535fd3ec55d586875012f9a299273e31d77d592"} Nov 24 12:32:59 crc kubenswrapper[4752]: I1124 12:32:59.501095 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4cs8l" event={"ID":"72555f28-0bcc-4ada-bd18-48e12c193a4a","Type":"ContainerStarted","Data":"5f8e43eb7c128d2b1d3b025f08305cafec784c2278cdca8848117404fab023ce"} Nov 24 12:32:59 crc kubenswrapper[4752]: I1124 12:32:59.523268 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4cs8l" podStartSLOduration=2.523236691 podStartE2EDuration="2.523236691s" podCreationTimestamp="2025-11-24 12:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:32:59.520528413 +0000 UTC m=+5185.505348722" watchObservedRunningTime="2025-11-24 12:32:59.523236691 +0000 UTC m=+5185.508057020" Nov 24 12:33:00 crc kubenswrapper[4752]: I1124 12:33:00.729817 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:33:00 crc kubenswrapper[4752]: E1124 12:33:00.732366 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:33:03 crc kubenswrapper[4752]: I1124 12:33:03.543606 4752 generic.go:334] "Generic (PLEG): container finished" podID="72555f28-0bcc-4ada-bd18-48e12c193a4a" containerID="09b52e83937056b67ffaaf5fd535fd3ec55d586875012f9a299273e31d77d592" exitCode=0 Nov 24 12:33:03 crc kubenswrapper[4752]: I1124 12:33:03.543678 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4cs8l" event={"ID":"72555f28-0bcc-4ada-bd18-48e12c193a4a","Type":"ContainerDied","Data":"09b52e83937056b67ffaaf5fd535fd3ec55d586875012f9a299273e31d77d592"} Nov 24 12:33:04 crc kubenswrapper[4752]: I1124 12:33:04.943810 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.118046 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config\") pod \"72555f28-0bcc-4ada-bd18-48e12c193a4a\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.118242 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq2gs\" (UniqueName: \"kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs\") pod \"72555f28-0bcc-4ada-bd18-48e12c193a4a\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.118309 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle\") pod \"72555f28-0bcc-4ada-bd18-48e12c193a4a\" (UID: \"72555f28-0bcc-4ada-bd18-48e12c193a4a\") " Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.131071 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs" (OuterVolumeSpecName: "kube-api-access-dq2gs") pod "72555f28-0bcc-4ada-bd18-48e12c193a4a" (UID: "72555f28-0bcc-4ada-bd18-48e12c193a4a"). InnerVolumeSpecName "kube-api-access-dq2gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.163143 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config" (OuterVolumeSpecName: "config") pod "72555f28-0bcc-4ada-bd18-48e12c193a4a" (UID: "72555f28-0bcc-4ada-bd18-48e12c193a4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.170037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72555f28-0bcc-4ada-bd18-48e12c193a4a" (UID: "72555f28-0bcc-4ada-bd18-48e12c193a4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.220682 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.220729 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/72555f28-0bcc-4ada-bd18-48e12c193a4a-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.220754 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq2gs\" (UniqueName: \"kubernetes.io/projected/72555f28-0bcc-4ada-bd18-48e12c193a4a-kube-api-access-dq2gs\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.593283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4cs8l" event={"ID":"72555f28-0bcc-4ada-bd18-48e12c193a4a","Type":"ContainerDied","Data":"5f8e43eb7c128d2b1d3b025f08305cafec784c2278cdca8848117404fab023ce"} Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.593327 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8e43eb7c128d2b1d3b025f08305cafec784c2278cdca8848117404fab023ce" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.593411 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4cs8l" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.826336 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:33:05 crc kubenswrapper[4752]: E1124 12:33:05.826776 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72555f28-0bcc-4ada-bd18-48e12c193a4a" containerName="neutron-db-sync" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.826792 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="72555f28-0bcc-4ada-bd18-48e12c193a4a" containerName="neutron-db-sync" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.826989 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="72555f28-0bcc-4ada-bd18-48e12c193a4a" containerName="neutron-db-sync" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.828054 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.861795 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.937699 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.937830 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.937875 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.937916 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hv7h\" (UniqueName: \"kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:05 crc kubenswrapper[4752]: I1124 12:33:05.937996 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.039067 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.039158 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.039193 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.039224 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5hv7h\" (UniqueName: \"kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.039262 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.040870 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.040972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.041312 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.041466 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.069801 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hv7h\" (UniqueName: \"kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h\") pod \"dnsmasq-dns-7c7fcc54fc-r87hf\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.114626 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8658d4b465-f6xtp"] Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.116297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.119305 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.124677 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.125034 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qxls4" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.163405 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.179822 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8658d4b465-f6xtp"] Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.246912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-httpd-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.247020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6hh4\" (UniqueName: \"kubernetes.io/projected/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-kube-api-access-g6hh4\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.247073 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-combined-ca-bundle\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.247156 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.348543 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.348590 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-httpd-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.348669 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6hh4\" (UniqueName: \"kubernetes.io/projected/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-kube-api-access-g6hh4\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.348718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-combined-ca-bundle\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.354607 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-httpd-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.355567 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-combined-ca-bundle\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.366471 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6hh4\" (UniqueName: \"kubernetes.io/projected/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-kube-api-access-g6hh4\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.375791 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7919c790-c895-4c2c-bae5-4dd6fb5a86bf-config\") pod \"neutron-8658d4b465-f6xtp\" (UID: \"7919c790-c895-4c2c-bae5-4dd6fb5a86bf\") " pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.444818 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:06 crc kubenswrapper[4752]: I1124 12:33:06.612657 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.049452 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8658d4b465-f6xtp"] Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.624313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8658d4b465-f6xtp" event={"ID":"7919c790-c895-4c2c-bae5-4dd6fb5a86bf","Type":"ContainerStarted","Data":"bcae33a8a977224319e512245f48b3e395b27d79adfc2d0f86c023019a7f6240"} Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.625172 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.625224 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8658d4b465-f6xtp" event={"ID":"7919c790-c895-4c2c-bae5-4dd6fb5a86bf","Type":"ContainerStarted","Data":"010cd8210df6b9baaab292a5a09a554caba0f3d374e7b542b8f7446ad52a1d53"} Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.625249 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8658d4b465-f6xtp" event={"ID":"7919c790-c895-4c2c-bae5-4dd6fb5a86bf","Type":"ContainerStarted","Data":"8119a41fa401181652cdb97d2a4ab6bd9c5f02920aaabcbe620c28a9e56dfdc6"} Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.627343 4752 generic.go:334] "Generic (PLEG): container finished" podID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerID="efb2162466882028963a722b81a04320e2e8a9355498ff192236a35dbece1c04" exitCode=0 Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.627378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" event={"ID":"14d15b12-6098-4bc8-9e5d-481426fcf470","Type":"ContainerDied","Data":"efb2162466882028963a722b81a04320e2e8a9355498ff192236a35dbece1c04"} Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 
12:33:07.627395 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" event={"ID":"14d15b12-6098-4bc8-9e5d-481426fcf470","Type":"ContainerStarted","Data":"fe8b136a0e074356b49892de787043b91e59049bf55e337aeaebb0ab836f2b5c"} Nov 24 12:33:07 crc kubenswrapper[4752]: I1124 12:33:07.650009 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8658d4b465-f6xtp" podStartSLOduration=1.6499875689999999 podStartE2EDuration="1.649987569s" podCreationTimestamp="2025-11-24 12:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:07.644331197 +0000 UTC m=+5193.629151486" watchObservedRunningTime="2025-11-24 12:33:07.649987569 +0000 UTC m=+5193.634807868" Nov 24 12:33:08 crc kubenswrapper[4752]: I1124 12:33:08.653011 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" event={"ID":"14d15b12-6098-4bc8-9e5d-481426fcf470","Type":"ContainerStarted","Data":"686b3c549cb0ddcb1ba619d2a797fffa21b726b2acf3c639b7c11e85584cf0aa"} Nov 24 12:33:08 crc kubenswrapper[4752]: I1124 12:33:08.687560 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" podStartSLOduration=3.687541774 podStartE2EDuration="3.687541774s" podCreationTimestamp="2025-11-24 12:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:08.682229031 +0000 UTC m=+5194.667049330" watchObservedRunningTime="2025-11-24 12:33:08.687541774 +0000 UTC m=+5194.672362063" Nov 24 12:33:09 crc kubenswrapper[4752]: I1124 12:33:09.664420 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.908216 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.910509 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.930325 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.977397 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.977475 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhnx\" (UniqueName: \"kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:13 crc kubenswrapper[4752]: I1124 12:33:13.977539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.079136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.079203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhnx\" (UniqueName: \"kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.079253 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.080132 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.080219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.103900 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qmhnx\" (UniqueName: \"kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx\") pod \"community-operators-h47fw\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.232995 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.715280 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:14 crc kubenswrapper[4752]: I1124 12:33:14.734709 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:33:14 crc kubenswrapper[4752]: E1124 12:33:14.735048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:33:15 crc kubenswrapper[4752]: I1124 12:33:15.718030 4752 generic.go:334] "Generic (PLEG): container finished" podID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerID="887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001" exitCode=0 Nov 24 12:33:15 crc kubenswrapper[4752]: I1124 12:33:15.718172 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerDied","Data":"887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001"} Nov 24 12:33:15 crc kubenswrapper[4752]: I1124 12:33:15.718375 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerStarted","Data":"02bbe28c1c7442283169e448b62519f01774a007f776d81557724966531383a0"} Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.166044 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.245066 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.245536 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="dnsmasq-dns" containerID="cri-o://add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a" gracePeriod=10 Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.736350 4752 generic.go:334] "Generic (PLEG): container finished" podID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerID="add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a" exitCode=0 Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.737626 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.738834 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerStarted","Data":"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301"} Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.738880 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" event={"ID":"0011a25f-8be1-41a4-8ab3-04d9e30f83cc","Type":"ContainerDied","Data":"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a"} Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.738900 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" event={"ID":"0011a25f-8be1-41a4-8ab3-04d9e30f83cc","Type":"ContainerDied","Data":"36443f1297c82d50f2087d9b4ece2387dc3d6bd76c4453af79022843356f011f"} Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.738921 4752 scope.go:117] "RemoveContainer" containerID="add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.766162 4752 scope.go:117] "RemoveContainer" containerID="d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.818865 4752 scope.go:117] "RemoveContainer" containerID="add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a" Nov 24 12:33:16 crc kubenswrapper[4752]: E1124 12:33:16.819914 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a\": container with ID starting with add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a not found: ID does not exist" containerID="add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.819942 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a"} err="failed to get container status \"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a\": rpc error: code = NotFound desc = could not find container \"add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a\": container with ID starting with add82764b29b4815f8f1bf2fed69c355771ca9fb678df381e996aadccce38e6a not found: ID does not exist" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.819961 4752 scope.go:117] "RemoveContainer" containerID="d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f" Nov 24 12:33:16 crc kubenswrapper[4752]: E1124 12:33:16.820392 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f\": container with ID starting with d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f not found: ID does not exist" containerID="d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.820446 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f"} err="failed to get container status 
\"d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f\": rpc error: code = NotFound desc = could not find container \"d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f\": container with ID starting with d1c0f7d057d05811cbdec2b71189c594a5e659d2a841a9576b9ebb28ccec374f not found: ID does not exist" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.842243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l8lt\" (UniqueName: \"kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt\") pod \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.842298 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc\") pod \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.842442 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config\") pod \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.842481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb\") pod \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.842521 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb\") pod \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\" (UID: \"0011a25f-8be1-41a4-8ab3-04d9e30f83cc\") " Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.849546 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt" (OuterVolumeSpecName: "kube-api-access-2l8lt") pod "0011a25f-8be1-41a4-8ab3-04d9e30f83cc" (UID: "0011a25f-8be1-41a4-8ab3-04d9e30f83cc"). InnerVolumeSpecName "kube-api-access-2l8lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.883509 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config" (OuterVolumeSpecName: "config") pod "0011a25f-8be1-41a4-8ab3-04d9e30f83cc" (UID: "0011a25f-8be1-41a4-8ab3-04d9e30f83cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.883821 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0011a25f-8be1-41a4-8ab3-04d9e30f83cc" (UID: "0011a25f-8be1-41a4-8ab3-04d9e30f83cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.886667 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0011a25f-8be1-41a4-8ab3-04d9e30f83cc" (UID: "0011a25f-8be1-41a4-8ab3-04d9e30f83cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.888566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0011a25f-8be1-41a4-8ab3-04d9e30f83cc" (UID: "0011a25f-8be1-41a4-8ab3-04d9e30f83cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.945063 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l8lt\" (UniqueName: \"kubernetes.io/projected/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-kube-api-access-2l8lt\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.945329 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.945443 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.945534 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:16 crc kubenswrapper[4752]: I1124 12:33:16.945627 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0011a25f-8be1-41a4-8ab3-04d9e30f83cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:17 crc kubenswrapper[4752]: I1124 12:33:17.749185 4752 generic.go:334] "Generic (PLEG): container finished" podID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerID="aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301" exitCode=0 Nov 24 12:33:17 crc kubenswrapper[4752]: I1124 12:33:17.749287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerDied","Data":"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301"} Nov 24 12:33:17 crc kubenswrapper[4752]: I1124 12:33:17.750984 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f8c95469-vlmpv" Nov 24 12:33:17 crc kubenswrapper[4752]: I1124 12:33:17.792221 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:33:17 crc kubenswrapper[4752]: I1124 12:33:17.798403 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f8c95469-vlmpv"] Nov 24 12:33:18 crc kubenswrapper[4752]: I1124 12:33:18.746290 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" path="/var/lib/kubelet/pods/0011a25f-8be1-41a4-8ab3-04d9e30f83cc/volumes" Nov 24 12:33:18 crc kubenswrapper[4752]: I1124 12:33:18.763987 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerStarted","Data":"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306"} Nov 24 12:33:18 crc kubenswrapper[4752]: I1124 12:33:18.791188 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h47fw" podStartSLOduration=3.355411797 podStartE2EDuration="5.791170564s" podCreationTimestamp="2025-11-24 12:33:13 +0000 UTC" firstStartedPulling="2025-11-24 12:33:15.72072788 +0000 UTC m=+5201.705548179" lastFinishedPulling="2025-11-24 12:33:18.156486647 +0000 UTC m=+5204.141306946" observedRunningTime="2025-11-24 12:33:18.787272893 +0000 UTC m=+5204.772093212" watchObservedRunningTime="2025-11-24 12:33:18.791170564 +0000 UTC m=+5204.775990873" Nov 24 12:33:24 crc kubenswrapper[4752]: I1124 12:33:24.233196 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:24 crc kubenswrapper[4752]: I1124 12:33:24.233829 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:24 crc kubenswrapper[4752]: I1124 12:33:24.308114 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:24 crc kubenswrapper[4752]: I1124 12:33:24.871863 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:24 crc kubenswrapper[4752]: I1124 12:33:24.938470 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:26 crc kubenswrapper[4752]: I1124 12:33:26.840231 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h47fw" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="registry-server" containerID="cri-o://99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306" gracePeriod=2 Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.447292 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.535729 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content\") pod \"4afceb8e-acb1-41c5-b2cf-df888b96892b\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.535830 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhnx\" (UniqueName: \"kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx\") pod \"4afceb8e-acb1-41c5-b2cf-df888b96892b\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.535935 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities\") pod \"4afceb8e-acb1-41c5-b2cf-df888b96892b\" (UID: \"4afceb8e-acb1-41c5-b2cf-df888b96892b\") " Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.536822 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities" (OuterVolumeSpecName: "utilities") pod "4afceb8e-acb1-41c5-b2cf-df888b96892b" (UID: "4afceb8e-acb1-41c5-b2cf-df888b96892b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.541119 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx" (OuterVolumeSpecName: "kube-api-access-qmhnx") pod "4afceb8e-acb1-41c5-b2cf-df888b96892b" (UID: "4afceb8e-acb1-41c5-b2cf-df888b96892b"). InnerVolumeSpecName "kube-api-access-qmhnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.637651 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhnx\" (UniqueName: \"kubernetes.io/projected/4afceb8e-acb1-41c5-b2cf-df888b96892b-kube-api-access-qmhnx\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.638080 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.654942 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4afceb8e-acb1-41c5-b2cf-df888b96892b" (UID: "4afceb8e-acb1-41c5-b2cf-df888b96892b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.728232 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:33:27 crc kubenswrapper[4752]: E1124 12:33:27.728544 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.738946 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afceb8e-acb1-41c5-b2cf-df888b96892b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.853256 4752 generic.go:334] "Generic (PLEG): container finished" podID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerID="99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306" exitCode=0 Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.853318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerDied","Data":"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306"} Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.853369 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h47fw" event={"ID":"4afceb8e-acb1-41c5-b2cf-df888b96892b","Type":"ContainerDied","Data":"02bbe28c1c7442283169e448b62519f01774a007f776d81557724966531383a0"} Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.853406 4752 scope.go:117] "RemoveContainer" containerID="99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.854664 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h47fw" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.879604 4752 scope.go:117] "RemoveContainer" containerID="aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.899513 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.913309 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h47fw"] Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.924631 4752 scope.go:117] "RemoveContainer" containerID="887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.976595 4752 scope.go:117] "RemoveContainer" containerID="99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306" Nov 24 12:33:27 crc kubenswrapper[4752]: E1124 12:33:27.977277 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306\": container with ID starting with 99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306 not found: ID does not exist" containerID="99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.977329 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306"} err="failed to get container status \"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306\": rpc error: code = NotFound desc = could not find container \"99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306\": container with ID starting with 99d26af6cd8a6bf957ee59c6305fae92f722d392b500ec4a0a0f7f6252665306 not found: ID does not exist" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.977369 4752 scope.go:117] "RemoveContainer" containerID="aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301" Nov 24 12:33:27 crc kubenswrapper[4752]: E1124 12:33:27.978081 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301\": container with ID starting with aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301 not found: ID does not exist" containerID="aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.978228 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301"} err="failed to get container status \"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301\": rpc error: code = NotFound desc = could not find container \"aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301\": container with ID starting with aefdbe9239a1cfc2081f2f6cc1efee287d2f43551eb8fe60251cefe37a25e301 not found: ID does not exist" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.978258 4752 scope.go:117] "RemoveContainer" containerID="887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001" Nov 24 12:33:27 crc kubenswrapper[4752]: E1124 12:33:27.978667 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001\": container with ID starting with 887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001 not found: ID does not exist" containerID="887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001" Nov 24 12:33:27 crc kubenswrapper[4752]: I1124 12:33:27.978703 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001"} err="failed to get container status \"887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001\": rpc error: code = NotFound desc = could not find container \"887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001\": container with ID starting with 887b2a68c7bd5482c515dfbb48ff81d355cfa050ce544564b6b7a235ebd88001 not found: ID does not exist" Nov 24 12:33:28 crc kubenswrapper[4752]: I1124 12:33:28.743512 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" path="/var/lib/kubelet/pods/4afceb8e-acb1-41c5-b2cf-df888b96892b/volumes" Nov 24 12:33:36 crc kubenswrapper[4752]: I1124 12:33:36.457998 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8658d4b465-f6xtp" Nov 24 12:33:42 crc kubenswrapper[4752]: I1124 12:33:42.728684 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:33:42 crc kubenswrapper[4752]: E1124 12:33:42.729833 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.414331 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zfhsz"] Nov 24 12:33:43 crc kubenswrapper[4752]: E1124 12:33:43.414717 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="extract-content" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415056 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="extract-content" Nov 24 12:33:43 crc kubenswrapper[4752]: E1124 12:33:43.415075 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="extract-utilities" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415083 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="extract-utilities" Nov 24 12:33:43 crc kubenswrapper[4752]: E1124 12:33:43.415106 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="dnsmasq-dns" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415115 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="dnsmasq-dns" Nov 24 12:33:43 crc kubenswrapper[4752]: E1124 12:33:43.415133 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="registry-server" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415139 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="registry-server" Nov 24 12:33:43 crc kubenswrapper[4752]: E1124 12:33:43.415159 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="init" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415166 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="init" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415329 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0011a25f-8be1-41a4-8ab3-04d9e30f83cc" containerName="dnsmasq-dns" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415348 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afceb8e-acb1-41c5-b2cf-df888b96892b" containerName="registry-server" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.415884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.421159 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zfhsz"] Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.469891 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.469946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h7bc\" (UniqueName: \"kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.510798 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3f3c-account-create-tkbzs"] Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.512239 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.521146 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.523646 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3f3c-account-create-tkbzs"] Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.571174 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.571226 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h7bc\" (UniqueName: \"kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.571319 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwlm6\" (UniqueName: \"kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.571386 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.572046 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.590663 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h7bc\" (UniqueName: \"kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc\") pod \"glance-db-create-zfhsz\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.673269 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwlm6\" (UniqueName: \"kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.673360 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " 
pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.674181 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.694632 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwlm6\" (UniqueName: \"kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6\") pod \"glance-3f3c-account-create-tkbzs\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.739047 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:43 crc kubenswrapper[4752]: I1124 12:33:43.844782 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:44 crc kubenswrapper[4752]: I1124 12:33:44.248718 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zfhsz"] Nov 24 12:33:44 crc kubenswrapper[4752]: I1124 12:33:44.332209 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3f3c-account-create-tkbzs"] Nov 24 12:33:44 crc kubenswrapper[4752]: W1124 12:33:44.335917 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12492e12_167b_4d92_8a59_fbb72d9b714a.slice/crio-445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13 WatchSource:0}: Error finding container 445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13: Status 404 returned error can't find the container with id 445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13 Nov 24 12:33:45 crc kubenswrapper[4752]: I1124 12:33:45.024459 4752 generic.go:334] "Generic (PLEG): container finished" podID="12492e12-167b-4d92-8a59-fbb72d9b714a" containerID="d087871730aedd97b867333e38a0fd463864679121f1418c73b64a4f44d11f11" exitCode=0 Nov 24 12:33:45 crc kubenswrapper[4752]: I1124 12:33:45.025045 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f3c-account-create-tkbzs" event={"ID":"12492e12-167b-4d92-8a59-fbb72d9b714a","Type":"ContainerDied","Data":"d087871730aedd97b867333e38a0fd463864679121f1418c73b64a4f44d11f11"} Nov 24 12:33:45 crc kubenswrapper[4752]: I1124 12:33:45.025093 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f3c-account-create-tkbzs" event={"ID":"12492e12-167b-4d92-8a59-fbb72d9b714a","Type":"ContainerStarted","Data":"445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13"} Nov 24 12:33:45 crc kubenswrapper[4752]: I1124 12:33:45.028275 4752 generic.go:334] "Generic (PLEG): container finished" podID="1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" containerID="ad31de39ea38d7148a7a06c921db1ce2e84f70de2691a73cae88ad44790fb196" exitCode=0 Nov 24 12:33:45 crc kubenswrapper[4752]: I1124 12:33:45.028330 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfhsz" event={"ID":"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e","Type":"ContainerDied","Data":"ad31de39ea38d7148a7a06c921db1ce2e84f70de2691a73cae88ad44790fb196"} Nov 24 12:33:45 crc 
kubenswrapper[4752]: I1124 12:33:45.028363 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfhsz" event={"ID":"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e","Type":"ContainerStarted","Data":"002745778f09a325d3422c78c3eea469658fa1477e4a1f2e47a01c6cce6fad1c"} Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.494025 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.500025 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.528609 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwlm6\" (UniqueName: \"kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6\") pod \"12492e12-167b-4d92-8a59-fbb72d9b714a\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.528681 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts\") pod \"12492e12-167b-4d92-8a59-fbb72d9b714a\" (UID: \"12492e12-167b-4d92-8a59-fbb72d9b714a\") " Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.528699 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h7bc\" (UniqueName: \"kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc\") pod \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.528792 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts\") pod \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\" (UID: \"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e\") " Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.529423 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" (UID: "1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.529469 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12492e12-167b-4d92-8a59-fbb72d9b714a" (UID: "12492e12-167b-4d92-8a59-fbb72d9b714a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.534330 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6" (OuterVolumeSpecName: "kube-api-access-pwlm6") pod "12492e12-167b-4d92-8a59-fbb72d9b714a" (UID: "12492e12-167b-4d92-8a59-fbb72d9b714a"). InnerVolumeSpecName "kube-api-access-pwlm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.536039 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc" (OuterVolumeSpecName: "kube-api-access-9h7bc") pod "1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" (UID: "1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e"). InnerVolumeSpecName "kube-api-access-9h7bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.629765 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.629829 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwlm6\" (UniqueName: \"kubernetes.io/projected/12492e12-167b-4d92-8a59-fbb72d9b714a-kube-api-access-pwlm6\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.629848 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12492e12-167b-4d92-8a59-fbb72d9b714a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:46 crc kubenswrapper[4752]: I1124 12:33:46.629860 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h7bc\" (UniqueName: \"kubernetes.io/projected/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e-kube-api-access-9h7bc\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.052231 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3f3c-account-create-tkbzs" Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.052234 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3f3c-account-create-tkbzs" event={"ID":"12492e12-167b-4d92-8a59-fbb72d9b714a","Type":"ContainerDied","Data":"445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13"} Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.052405 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="445cd6ba53534f1670c40e57679faf54ac85ea1a51bebf04e48eca10ecbdbc13" Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.054582 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zfhsz" event={"ID":"1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e","Type":"ContainerDied","Data":"002745778f09a325d3422c78c3eea469658fa1477e4a1f2e47a01c6cce6fad1c"} Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.054602 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="002745778f09a325d3422c78c3eea469658fa1477e4a1f2e47a01c6cce6fad1c" Nov 24 12:33:47 crc kubenswrapper[4752]: I1124 12:33:47.054676 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zfhsz" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.787353 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lp9hg"] Nov 24 12:33:48 crc kubenswrapper[4752]: E1124 12:33:48.788126 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12492e12-167b-4d92-8a59-fbb72d9b714a" containerName="mariadb-account-create" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.788141 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="12492e12-167b-4d92-8a59-fbb72d9b714a" containerName="mariadb-account-create" Nov 24 12:33:48 crc kubenswrapper[4752]: E1124 12:33:48.788188 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" containerName="mariadb-database-create" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.788196 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" containerName="mariadb-database-create" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.788417 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" containerName="mariadb-database-create" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.788439 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="12492e12-167b-4d92-8a59-fbb72d9b714a" containerName="mariadb-account-create" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.789149 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.792423 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jxspm" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.800232 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.827034 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lp9hg"] Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.975193 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.975326 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthf9\" (UniqueName: \"kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.975375 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:48 crc kubenswrapper[4752]: I1124 12:33:48.975404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.076413 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.076810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.077012 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.077188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wthf9\" (UniqueName: \"kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.082475 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.084055 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.084066 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.110688 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthf9\" (UniqueName: \"kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9\") pod \"glance-db-sync-lp9hg\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:49 crc kubenswrapper[4752]: I1124 12:33:49.411147 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:50 crc kubenswrapper[4752]: I1124 12:33:50.017088 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lp9hg"] Nov 24 12:33:50 crc kubenswrapper[4752]: W1124 12:33:50.019416 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52b679aa_aa7b_4e82_9bbb_9c2642c7feb8.slice/crio-bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8 WatchSource:0}: Error finding container bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8: Status 404 returned error can't find the container with id bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8 Nov 24 12:33:50 crc kubenswrapper[4752]: I1124 12:33:50.096738 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lp9hg" event={"ID":"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8","Type":"ContainerStarted","Data":"bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8"} Nov 24 12:33:51 crc kubenswrapper[4752]: I1124 12:33:51.108324 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lp9hg" event={"ID":"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8","Type":"ContainerStarted","Data":"008c4ec2a5cb9debd960bd9495d9f670997fe14c99c46158134c2cab1e971864"} Nov 24 12:33:51 crc kubenswrapper[4752]: I1124 12:33:51.127615 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lp9hg" podStartSLOduration=3.127586723 podStartE2EDuration="3.127586723s" podCreationTimestamp="2025-11-24 12:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:51.127357937 +0000 UTC m=+5237.112178236" watchObservedRunningTime="2025-11-24 12:33:51.127586723 +0000 UTC m=+5237.112407032" Nov 24 12:33:54 crc kubenswrapper[4752]: I1124 12:33:54.138560 4752 generic.go:334] "Generic (PLEG): container finished" podID="52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" containerID="008c4ec2a5cb9debd960bd9495d9f670997fe14c99c46158134c2cab1e971864" exitCode=0 Nov 24 12:33:54 crc kubenswrapper[4752]: I1124 12:33:54.138605 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lp9hg" event={"ID":"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8","Type":"ContainerDied","Data":"008c4ec2a5cb9debd960bd9495d9f670997fe14c99c46158134c2cab1e971864"} Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.569475 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.691166 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wthf9\" (UniqueName: \"kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9\") pod \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.691303 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle\") pod \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.691365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data\") pod \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.691531 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data\") pod \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\" (UID: \"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8\") " Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.706567 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" (UID: "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.707067 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9" (OuterVolumeSpecName: "kube-api-access-wthf9") pod "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" (UID: "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8"). InnerVolumeSpecName "kube-api-access-wthf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.735215 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" (UID: "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.741432 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data" (OuterVolumeSpecName: "config-data") pod "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" (UID: "52b679aa-aa7b-4e82-9bbb-9c2642c7feb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.793730 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wthf9\" (UniqueName: \"kubernetes.io/projected/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-kube-api-access-wthf9\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.793812 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.793822 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:55 crc kubenswrapper[4752]: I1124 12:33:55.793833 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.161362 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lp9hg" event={"ID":"52b679aa-aa7b-4e82-9bbb-9c2642c7feb8","Type":"ContainerDied","Data":"bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8"} Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.161439 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8007193455f6b2ce8521e9a0933fc8b18f036cf8f6189048af7ba39a9c05e8" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.161455 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lp9hg" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.523385 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:33:56 crc kubenswrapper[4752]: E1124 12:33:56.523716 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" containerName="glance-db-sync" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.523728 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" containerName="glance-db-sync" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.523987 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" containerName="glance-db-sync" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.524922 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.527064 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.528945 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.531960 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.535807 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jxspm" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.542632 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.605781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606164 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606239 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606346 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.606412 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qp9\" (UniqueName: 
\"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.632190 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"] Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.633585 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.664138 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"] Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.709125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.709412 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.709470 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.709983 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710053 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qp9\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710491 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswm9\" (UniqueName: \"kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710602 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710649 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.710736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.711140 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.711195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.711637 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.714614 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.715435 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.717186 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.718414 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.719804 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.725633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.727596 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.744373 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.768869 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.781566 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qp9\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9\") pod \"glance-default-external-api-0\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812174 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc79l\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812217 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812269 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812283 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812319 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812342 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812367 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812433 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswm9\" (UniqueName: \"kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812495 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812518 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.812536 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.813709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.814073 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.814654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.814806 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.838737 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswm9\" (UniqueName: \"kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9\") pod \"dnsmasq-dns-7c8c4d4d9c-pncnc\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.844366 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.913603 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.914322 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.914696 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.914837 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.914984 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc79l\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 
12:33:56.915099 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.915233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.915368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.915602 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.919608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.927328 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.927371 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.927464 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.950852 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc79l\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l\") pod \"glance-default-internal-api-0\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:33:56 crc kubenswrapper[4752]: I1124 12:33:56.953472 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.191493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b"}
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.214176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.399624 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:33:57 crc kubenswrapper[4752]: W1124 12:33:57.408121 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod215da164_20e4_47a8_91a7_d959b8e12463.slice/crio-9f20bd4f6b83a80ca3edf55250c9c989f19a79c0d498db03b78f2116ccef1503 WatchSource:0}: Error finding container 9f20bd4f6b83a80ca3edf55250c9c989f19a79c0d498db03b78f2116ccef1503: Status 404 returned error can't find the container with id 9f20bd4f6b83a80ca3edf55250c9c989f19a79c0d498db03b78f2116ccef1503
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.435897 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"]
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.616998 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:33:57 crc kubenswrapper[4752]: I1124 12:33:57.825161 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 24 12:33:57 crc kubenswrapper[4752]: W1124 12:33:57.842131 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79276fef_b858_4128_bddd_a4358cc57711.slice/crio-f128d330260215a2317f41a2b254e9dd30a70dfde10bacf51f68fd781a6af96f WatchSource:0}: Error finding container f128d330260215a2317f41a2b254e9dd30a70dfde10bacf51f68fd781a6af96f: Status 404 returned error can't find the container with id f128d330260215a2317f41a2b254e9dd30a70dfde10bacf51f68fd781a6af96f
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.201212 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerStarted","Data":"f128d330260215a2317f41a2b254e9dd30a70dfde10bacf51f68fd781a6af96f"}
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.203352 4752 generic.go:334] "Generic (PLEG): container finished" podID="27add03f-da4d-4050-9b42-6bc8979311c4" containerID="e9745bd8b44f469720f5a096053a66d49d8217a2ccf49fb67159efd4dda2a74e" exitCode=0
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.203425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" event={"ID":"27add03f-da4d-4050-9b42-6bc8979311c4","Type":"ContainerDied","Data":"e9745bd8b44f469720f5a096053a66d49d8217a2ccf49fb67159efd4dda2a74e"}
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.203449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" event={"ID":"27add03f-da4d-4050-9b42-6bc8979311c4","Type":"ContainerStarted","Data":"760e7a6408aad2752b3e6e60ce02243e679db0cd8f15448a29d47a727dc5dfcd"}
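
[editor's note] The long run of reconciler_common.go and operation_generator.go entries above traces the kubelet volume manager's per-volume flow: VerifyControllerAttachedVolume (reconciler_common.go:245), then MountVolume started (reconciler_common.go:218), then MountVolume.SetUp succeeded (operation_generator.go:637). A minimal, illustrative Go parser that flags mounts which never reach the succeeded step is sketched below. It is the annotator's own tooling, not kubelet code: the regexps are derived from the log text above, and it assumes one journal entry per input line.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Tally "MountVolume started" against "MountVolume.SetUp succeeded" per pod.
func main() {
	started := regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\?"([^"\\]+)\\?".*pod="([^"]+)"`)
	succeeded := regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)\\?".*pod="([^"]+)"`)
	pending := map[string]map[string]bool{} // pod -> volumes still awaiting SetUp
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal entries are long
	for sc.Scan() {
		if m := started.FindStringSubmatch(sc.Text()); m != nil {
			if pending[m[2]] == nil {
				pending[m[2]] = map[string]bool{}
			}
			pending[m[2]][m[1]] = true
		} else if m := succeeded.FindStringSubmatch(sc.Text()); m != nil {
			delete(pending[m[2]], m[1])
		}
	}
	for pod, vols := range pending {
		for vol := range vols {
			fmt.Printf("%s: volume %q has no SetUp-succeeded entry in this capture\n", pod, vol)
		}
	}
}

Fed this journal slice, it reports only volumes whose SetUp-succeeded entry falls outside the captured window.
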
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.206218 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerStarted","Data":"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"}
Nov 24 12:33:58 crc kubenswrapper[4752]: I1124 12:33:58.206257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerStarted","Data":"9f20bd4f6b83a80ca3edf55250c9c989f19a79c0d498db03b78f2116ccef1503"}
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.215535 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerStarted","Data":"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0"}
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.215662 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-log" containerID="cri-o://cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769" gracePeriod=30
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.215700 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-httpd" containerID="cri-o://c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0" gracePeriod=30
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.219201 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerStarted","Data":"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3"}
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.219274 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerStarted","Data":"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f"}
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.221906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" event={"ID":"27add03f-da4d-4050-9b42-6bc8979311c4","Type":"ContainerStarted","Data":"1a088eb537ab104cac4db7a20cb40f46d9026ba7bb6913a90cacd74d803f2798"}
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.222170 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.245719 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.245697933 podStartE2EDuration="3.245697933s" podCreationTimestamp="2025-11-24 12:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:59.235117479 +0000 UTC m=+5245.219937768" watchObservedRunningTime="2025-11-24 12:33:59.245697933 +0000 UTC m=+5245.230518232"
Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.260367 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0"
podStartSLOduration=3.260343693 podStartE2EDuration="3.260343693s" podCreationTimestamp="2025-11-24 12:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:59.252513218 +0000 UTC m=+5245.237333517" watchObservedRunningTime="2025-11-24 12:33:59.260343693 +0000 UTC m=+5245.245164002" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.277275 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" podStartSLOduration=3.277248698 podStartE2EDuration="3.277248698s" podCreationTimestamp="2025-11-24 12:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:33:59.270555786 +0000 UTC m=+5245.255376075" watchObservedRunningTime="2025-11-24 12:33:59.277248698 +0000 UTC m=+5245.262068997" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.409100 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.829978 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.984827 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7qp9\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.984978 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985101 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985181 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985286 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985634 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data\") pod \"215da164-20e4-47a8-91a7-d959b8e12463\" (UID: \"215da164-20e4-47a8-91a7-d959b8e12463\") " Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.985673 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.986312 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs" (OuterVolumeSpecName: "logs") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.986437 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.986477 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/215da164-20e4-47a8-91a7-d959b8e12463-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.992386 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph" (OuterVolumeSpecName: "ceph") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.992869 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9" (OuterVolumeSpecName: "kube-api-access-n7qp9") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "kube-api-access-n7qp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:33:59 crc kubenswrapper[4752]: I1124 12:33:59.993399 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts" (OuterVolumeSpecName: "scripts") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.017709 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.039670 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data" (OuterVolumeSpecName: "config-data") pod "215da164-20e4-47a8-91a7-d959b8e12463" (UID: "215da164-20e4-47a8-91a7-d959b8e12463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.087570 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.087605 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.087617 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.087629 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7qp9\" (UniqueName: \"kubernetes.io/projected/215da164-20e4-47a8-91a7-d959b8e12463-kube-api-access-n7qp9\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.087642 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/215da164-20e4-47a8-91a7-d959b8e12463-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230197 4752 generic.go:334] "Generic (PLEG): container finished" podID="215da164-20e4-47a8-91a7-d959b8e12463" containerID="c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0" exitCode=0 Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230229 4752 generic.go:334] "Generic (PLEG): container finished" podID="215da164-20e4-47a8-91a7-d959b8e12463" containerID="cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769" exitCode=143 Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerDied","Data":"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0"} Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230268 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerDied","Data":"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"} Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230279 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230279 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"215da164-20e4-47a8-91a7-d959b8e12463","Type":"ContainerDied","Data":"9f20bd4f6b83a80ca3edf55250c9c989f19a79c0d498db03b78f2116ccef1503"}
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.230294 4752 scope.go:117] "RemoveContainer" containerID="c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.270808 4752 scope.go:117] "RemoveContainer" containerID="cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.273454 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.297405 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.310033 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:34:00 crc kubenswrapper[4752]: E1124 12:34:00.310414 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-log"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.310428 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-log"
Nov 24 12:34:00 crc kubenswrapper[4752]: E1124 12:34:00.310462 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-httpd"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.310470 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-httpd"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.310676 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-httpd"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.310690 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="215da164-20e4-47a8-91a7-d959b8e12463" containerName="glance-log"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.311730 4752 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.315416 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.340057 4752 scope.go:117] "RemoveContainer" containerID="c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.340874 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 12:34:00 crc kubenswrapper[4752]: E1124 12:34:00.341221 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0\": container with ID starting with c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0 not found: ID does not exist" containerID="c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.341295 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0"} err="failed to get container status \"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0\": rpc error: code = NotFound desc = could not find container \"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0\": container with ID starting with c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0 not found: ID does not exist" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.341402 4752 scope.go:117] "RemoveContainer" containerID="cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769" Nov 24 12:34:00 crc kubenswrapper[4752]: E1124 12:34:00.343075 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769\": container with ID starting with cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769 not found: ID does not exist" containerID="cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.343145 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"} err="failed to get container status \"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769\": rpc error: code = NotFound desc = could not find container \"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769\": container with ID starting with cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769 not found: ID does not exist" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.343195 4752 scope.go:117] "RemoveContainer" containerID="c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.343544 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0"} err="failed to get container status \"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0\": rpc error: code = NotFound desc = could not find container \"c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0\": container with ID 
starting with c0f406694cb6284119bc5c54c4c56705bbe026e4532a54cbd5bb5861ad8855c0 not found: ID does not exist"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.343566 4752 scope.go:117] "RemoveContainer" containerID="cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.344037 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769"} err="failed to get container status \"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769\": rpc error: code = NotFound desc = could not find container \"cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769\": container with ID starting with cea400ea5c6d240b340d69fcdfc2b31fd7f1a18909b55c83a82203cb6d15b769 not found: ID does not exist"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494144 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494225 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2mg\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494263 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494339 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494357 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.494388 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0"
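
[editor's note] The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" repetition above is benign: by the time the kubelet re-queried CRI-O, the containers were already gone, and a NotFound answer during deletion means the desired state is already reached. A toy Go sketch of that tolerate-NotFound pattern follows; the runtimeClient interface and fakeRuntime are stand-ins invented for illustration, not the real CRI client.

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound")

type runtimeClient interface {
	RemoveContainer(id string) error
}

// removeIfPresent treats NotFound as success, so retries and races with the
// runtime's own garbage collection stay harmless.
func removeIfPresent(rt runtimeClient, id string) error {
	if err := rt.RemoveContainer(id); err != nil && !errors.Is(err, errNotFound) {
		return err
	}
	return nil
}

type fakeRuntime struct{ present map[string]bool }

func (f *fakeRuntime) RemoveContainer(id string) error {
	if !f.present[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(f.present, id)
	return nil
}

func main() {
	rt := &fakeRuntime{present: map[string]bool{"c0f40669": true}}
	fmt.Println(removeIfPresent(rt, "c0f40669")) // <nil>: removed
	fmt.Println(removeIfPresent(rt, "c0f40669")) // <nil>: NotFound tolerated
}
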
\"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596160 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596300 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596326 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596351 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.596414 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2mg\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.597158 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.597986 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.601691 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.602339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.602587 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.604535 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.620054 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2mg\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg\") pod \"glance-default-external-api-0\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " pod="openstack/glance-default-external-api-0" Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.665849 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 24 12:34:00 crc kubenswrapper[4752]: I1124 12:34:00.738628 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215da164-20e4-47a8-91a7-d959b8e12463" path="/var/lib/kubelet/pods/215da164-20e4-47a8-91a7-d959b8e12463/volumes"
Nov 24 12:34:01 crc kubenswrapper[4752]: I1124 12:34:01.197976 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:34:01 crc kubenswrapper[4752]: W1124 12:34:01.201087 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode908313d_0c05_4500_9ad2_c8f86d019672.slice/crio-96fd641a1e03cef22251f28cd456c9730e46bd9ba2e8c56f33d8b8d3ff1498bf WatchSource:0}: Error finding container 96fd641a1e03cef22251f28cd456c9730e46bd9ba2e8c56f33d8b8d3ff1498bf: Status 404 returned error can't find the container with id 96fd641a1e03cef22251f28cd456c9730e46bd9ba2e8c56f33d8b8d3ff1498bf
Nov 24 12:34:01 crc kubenswrapper[4752]: I1124 12:34:01.239038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerStarted","Data":"96fd641a1e03cef22251f28cd456c9730e46bd9ba2e8c56f33d8b8d3ff1498bf"}
Nov 24 12:34:01 crc kubenswrapper[4752]: I1124 12:34:01.240530 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-log" containerID="cri-o://d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" gracePeriod=30
Nov 24 12:34:01 crc kubenswrapper[4752]: I1124 12:34:01.240651 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-httpd" containerID="cri-o://067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" gracePeriod=30
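
[editor's note] "Cleaned up orphaned pod volumes dir" above is the tail end of the old incarnation's teardown: once the UnmountVolume.TearDown entries have emptied /var/lib/kubelet/pods/<uid>/volumes, the per-UID directory can be pruned. The standalone sketch below only reports empty volumes directories under the kubelet pod root; it deletes nothing and is not the kubelet's actual cleanup path (which also checks that the pod is gone from the API), just an illustration of the directory layout the log entry names.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	root := "/var/lib/kubelet/pods" // kubelet's default per-pod directory root
	entries, err := os.ReadDir(root)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue // pod dirs are named by pod UID
		}
		volumes := filepath.Join(root, e.Name(), "volumes")
		contents, err := os.ReadDir(volumes)
		if err != nil {
			continue // no volumes dir; nothing to prune
		}
		if len(contents) == 0 {
			fmt.Printf("pod %s: empty volumes dir, candidate for cleanup\n", e.Name())
		}
	}
}
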
Nov 24 12:34:01 crc kubenswrapper[4752]: I1124 12:34:01.870301 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.016928 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") "
Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.016984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc79l\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") "
Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.017634 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") "
Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.017918 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs" (OuterVolumeSpecName: "logs") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.018557 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "httpd-run".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.018565 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.018838 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.019066 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.019100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle\") pod \"79276fef-b858-4128-bddd-a4358cc57711\" (UID: \"79276fef-b858-4128-bddd-a4358cc57711\") " Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.020336 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.020360 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79276fef-b858-4128-bddd-a4358cc57711-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.023224 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l" (OuterVolumeSpecName: "kube-api-access-pc79l") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "kube-api-access-pc79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.035702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph" (OuterVolumeSpecName: "ceph") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.036860 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts" (OuterVolumeSpecName: "scripts") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.057377 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.073835 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data" (OuterVolumeSpecName: "config-data") pod "79276fef-b858-4128-bddd-a4358cc57711" (UID: "79276fef-b858-4128-bddd-a4358cc57711"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.121327 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.121354 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.121366 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc79l\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-kube-api-access-pc79l\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.121375 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/79276fef-b858-4128-bddd-a4358cc57711-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.121383 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79276fef-b858-4128-bddd-a4358cc57711-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.250272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerStarted","Data":"ff73a3acfb77932f17ac7df874edbbbcc3ae7dbe26c5b6ab575d9a8599eea7c0"} Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.250649 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerStarted","Data":"39a6c2690f144ed244f0a31e220599fe374a4c98d8e6ec9697b62bb8c22b3d14"} Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252774 4752 generic.go:334] "Generic (PLEG): container finished" podID="79276fef-b858-4128-bddd-a4358cc57711" containerID="067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" exitCode=0 Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252811 4752 generic.go:334] "Generic (PLEG): container finished" podID="79276fef-b858-4128-bddd-a4358cc57711" containerID="d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" exitCode=143 Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252835 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerDied","Data":"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3"} Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252849 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerDied","Data":"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f"} Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252876 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"79276fef-b858-4128-bddd-a4358cc57711","Type":"ContainerDied","Data":"f128d330260215a2317f41a2b254e9dd30a70dfde10bacf51f68fd781a6af96f"} Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.252893 4752 scope.go:117] "RemoveContainer" containerID="067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.272628 4752 scope.go:117] "RemoveContainer" containerID="d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.284202 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.2841828299999998 podStartE2EDuration="2.28418283s" podCreationTimestamp="2025-11-24 12:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:34:02.281521714 +0000 UTC m=+5248.266342013" watchObservedRunningTime="2025-11-24 12:34:02.28418283 +0000 UTC m=+5248.269003119" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.309975 4752 scope.go:117] "RemoveContainer" containerID="067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" Nov 24 12:34:02 crc kubenswrapper[4752]: E1124 12:34:02.314145 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3\": container with ID starting with 067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3 not found: ID does not exist" containerID="067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.314246 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3"} err="failed to get container status \"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3\": rpc error: code = NotFound desc = could not find container \"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3\": container with ID starting with 067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3 not found: ID does not exist" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.314281 4752 scope.go:117] "RemoveContainer" containerID="d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" Nov 24 12:34:02 crc kubenswrapper[4752]: E1124 12:34:02.314766 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f\": container with ID starting with d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f not found: ID does not exist" containerID="d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.314799 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f"} err="failed to get container status \"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f\": rpc error: code = NotFound desc = could not find container \"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f\": container with ID starting with d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f not found: ID does not exist" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.314817 4752 scope.go:117] "RemoveContainer" containerID="067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.315026 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3"} err="failed to get container status \"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3\": rpc error: code = NotFound desc = could not find container \"067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3\": container with ID starting with 067b656a7885f62440818b30e543018ea0bbf251c468838e9915e0585f6734b3 not found: ID does not exist" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.315054 4752 scope.go:117] "RemoveContainer" containerID="d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.315265 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f"} err="failed to get container status \"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f\": rpc error: code = NotFound desc = could not find container \"d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f\": container with ID starting with d9969cf51e7992cc4f342847b4ed5a14d7620b7e9755bcf9787298dc9df2311f not found: ID does not exist" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.317262 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.328178 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.337402 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:34:02 crc kubenswrapper[4752]: E1124 12:34:02.337800 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-log" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.337814 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-log" Nov 24 12:34:02 crc kubenswrapper[4752]: E1124 12:34:02.337825 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-httpd" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 
12:34:02.337832 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-httpd" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.338060 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-httpd" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.338075 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="79276fef-b858-4128-bddd-a4358cc57711" containerName="glance-log" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.339192 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.346045 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.348252 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.527548 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.527789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.527843 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.527923 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.527972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.528028 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.528051 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8f9x\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629845 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.629986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8f9x\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.630021 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.630474 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.630556 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.634717 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.635120 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.635366 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.637655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.644505 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8f9x\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x\") pod \"glance-default-internal-api-0\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.677102 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:02 crc kubenswrapper[4752]: I1124 12:34:02.751698 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79276fef-b858-4128-bddd-a4358cc57711" path="/var/lib/kubelet/pods/79276fef-b858-4128-bddd-a4358cc57711/volumes" Nov 24 12:34:03 crc kubenswrapper[4752]: I1124 12:34:03.238892 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:34:03 crc kubenswrapper[4752]: W1124 12:34:03.244252 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d1cd63_792f_41e5_ab1a_b322680a18f1.slice/crio-37d9198e31cd920fe6bcdc4c246a90e558abde334379af8bb1cb94b965b6d701 WatchSource:0}: Error finding container 37d9198e31cd920fe6bcdc4c246a90e558abde334379af8bb1cb94b965b6d701: Status 404 returned error can't find the container with id 37d9198e31cd920fe6bcdc4c246a90e558abde334379af8bb1cb94b965b6d701 Nov 24 12:34:03 crc kubenswrapper[4752]: I1124 12:34:03.262147 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerStarted","Data":"37d9198e31cd920fe6bcdc4c246a90e558abde334379af8bb1cb94b965b6d701"} Nov 24 12:34:04 crc kubenswrapper[4752]: I1124 12:34:04.273733 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerStarted","Data":"324e2b142bd12d04603cad9693b5d4ad70118361be149b988b4ca1a86ae49b83"} Nov 24 12:34:04 crc kubenswrapper[4752]: I1124 12:34:04.274110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerStarted","Data":"9b0aae8c06275e141ede063b85eaeddb44ff2527a8822d67d87b4ba9edb9b29c"} Nov 24 12:34:04 crc kubenswrapper[4752]: I1124 12:34:04.296720 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.296705476 podStartE2EDuration="2.296705476s" podCreationTimestamp="2025-11-24 12:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:34:04.292249768 +0000 UTC m=+5250.277070067" watchObservedRunningTime="2025-11-24 12:34:04.296705476 +0000 UTC m=+5250.281525765" Nov 24 12:34:06 crc kubenswrapper[4752]: I1124 12:34:06.956034 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.035128 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.035404 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="dnsmasq-dns" containerID="cri-o://686b3c549cb0ddcb1ba619d2a797fffa21b726b2acf3c639b7c11e85584cf0aa" gracePeriod=10 Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.305700 4752 generic.go:334] "Generic (PLEG): container finished" podID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerID="686b3c549cb0ddcb1ba619d2a797fffa21b726b2acf3c639b7c11e85584cf0aa" exitCode=0 Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 
12:34:07.305803 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" event={"ID":"14d15b12-6098-4bc8-9e5d-481426fcf470","Type":"ContainerDied","Data":"686b3c549cb0ddcb1ba619d2a797fffa21b726b2acf3c639b7c11e85584cf0aa"} Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.568291 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.719737 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb\") pod \"14d15b12-6098-4bc8-9e5d-481426fcf470\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.719825 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb\") pod \"14d15b12-6098-4bc8-9e5d-481426fcf470\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.719910 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config\") pod \"14d15b12-6098-4bc8-9e5d-481426fcf470\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.719978 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc\") pod \"14d15b12-6098-4bc8-9e5d-481426fcf470\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.720057 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hv7h\" (UniqueName: \"kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h\") pod \"14d15b12-6098-4bc8-9e5d-481426fcf470\" (UID: \"14d15b12-6098-4bc8-9e5d-481426fcf470\") " Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.726926 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h" (OuterVolumeSpecName: "kube-api-access-5hv7h") pod "14d15b12-6098-4bc8-9e5d-481426fcf470" (UID: "14d15b12-6098-4bc8-9e5d-481426fcf470"). InnerVolumeSpecName "kube-api-access-5hv7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.763755 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14d15b12-6098-4bc8-9e5d-481426fcf470" (UID: "14d15b12-6098-4bc8-9e5d-481426fcf470"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.767994 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config" (OuterVolumeSpecName: "config") pod "14d15b12-6098-4bc8-9e5d-481426fcf470" (UID: "14d15b12-6098-4bc8-9e5d-481426fcf470"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.771649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "14d15b12-6098-4bc8-9e5d-481426fcf470" (UID: "14d15b12-6098-4bc8-9e5d-481426fcf470"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.773542 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14d15b12-6098-4bc8-9e5d-481426fcf470" (UID: "14d15b12-6098-4bc8-9e5d-481426fcf470"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.822105 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hv7h\" (UniqueName: \"kubernetes.io/projected/14d15b12-6098-4bc8-9e5d-481426fcf470-kube-api-access-5hv7h\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.822136 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.822144 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.822153 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:07 crc kubenswrapper[4752]: I1124 12:34:07.822162 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14d15b12-6098-4bc8-9e5d-481426fcf470-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.316195 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" event={"ID":"14d15b12-6098-4bc8-9e5d-481426fcf470","Type":"ContainerDied","Data":"fe8b136a0e074356b49892de787043b91e59049bf55e337aeaebb0ab836f2b5c"} Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.316264 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c7fcc54fc-r87hf" Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.316594 4752 scope.go:117] "RemoveContainer" containerID="686b3c549cb0ddcb1ba619d2a797fffa21b726b2acf3c639b7c11e85584cf0aa" Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.354769 4752 scope.go:117] "RemoveContainer" containerID="efb2162466882028963a722b81a04320e2e8a9355498ff192236a35dbece1c04" Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.374282 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.380519 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c7fcc54fc-r87hf"] Nov 24 12:34:08 crc kubenswrapper[4752]: I1124 12:34:08.747542 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" path="/var/lib/kubelet/pods/14d15b12-6098-4bc8-9e5d-481426fcf470/volumes" Nov 24 12:34:10 crc kubenswrapper[4752]: I1124 12:34:10.667270 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 12:34:10 crc kubenswrapper[4752]: I1124 12:34:10.667673 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 12:34:10 crc kubenswrapper[4752]: I1124 12:34:10.699453 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 12:34:10 crc kubenswrapper[4752]: I1124 12:34:10.708836 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 12:34:11 crc kubenswrapper[4752]: I1124 12:34:11.350405 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 12:34:11 crc kubenswrapper[4752]: I1124 12:34:11.350439 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 12:34:12 crc kubenswrapper[4752]: I1124 12:34:12.678294 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:12 crc kubenswrapper[4752]: I1124 12:34:12.679003 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:12 crc kubenswrapper[4752]: I1124 12:34:12.713078 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:12 crc kubenswrapper[4752]: I1124 12:34:12.745241 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:13 crc kubenswrapper[4752]: I1124 12:34:13.325185 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 12:34:13 crc kubenswrapper[4752]: I1124 12:34:13.367616 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 12:34:13 crc kubenswrapper[4752]: I1124 12:34:13.368029 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:13 crc kubenswrapper[4752]: I1124 12:34:13.368071 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:13 crc 
kubenswrapper[4752]: I1124 12:34:13.424563 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 12:34:15 crc kubenswrapper[4752]: I1124 12:34:15.414469 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:15 crc kubenswrapper[4752]: I1124 12:34:15.415271 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 12:34:15 crc kubenswrapper[4752]: I1124 12:34:15.531223 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.344943 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xrgxg"] Nov 24 12:34:23 crc kubenswrapper[4752]: E1124 12:34:23.346793 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="dnsmasq-dns" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.346903 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="dnsmasq-dns" Nov 24 12:34:23 crc kubenswrapper[4752]: E1124 12:34:23.347000 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="init" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.347080 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="init" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.347375 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d15b12-6098-4bc8-9e5d-481426fcf470" containerName="dnsmasq-dns" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.348162 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.355274 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrgxg"] Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.408070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hgd\" (UniqueName: \"kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.408248 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.441985 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7355-account-create-2vzv7"] Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.443543 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.445231 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.452174 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7355-account-create-2vzv7"] Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.509939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.510047 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts\") pod \"placement-7355-account-create-2vzv7\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.510105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrbs\" (UniqueName: \"kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs\") pod \"placement-7355-account-create-2vzv7\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.510163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77hgd\" (UniqueName: \"kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.511559 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.553978 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77hgd\" (UniqueName: \"kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd\") pod \"placement-db-create-xrgxg\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.611511 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrbs\" (UniqueName: \"kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs\") pod \"placement-7355-account-create-2vzv7\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.611889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts\") pod \"placement-7355-account-create-2vzv7\" (UID: 
\"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.612563 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts\") pod \"placement-7355-account-create-2vzv7\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.631652 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrbs\" (UniqueName: \"kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs\") pod \"placement-7355-account-create-2vzv7\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.673109 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.763115 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:23 crc kubenswrapper[4752]: I1124 12:34:23.975739 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrgxg"] Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.264138 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7355-account-create-2vzv7"] Nov 24 12:34:24 crc kubenswrapper[4752]: W1124 12:34:24.265706 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98072d93_5346_4958_9c03_136c784abc74.slice/crio-f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b WatchSource:0}: Error finding container f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b: Status 404 returned error can't find the container with id f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.501730 4752 generic.go:334] "Generic (PLEG): container finished" podID="892c65e3-cd5b-4c52-8b65-35205b02e26b" containerID="9cf4a0e6c9f7c858c142856c13c36e4eaa4e581765bc099e8a893deceb3cbb1b" exitCode=0 Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.501804 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrgxg" event={"ID":"892c65e3-cd5b-4c52-8b65-35205b02e26b","Type":"ContainerDied","Data":"9cf4a0e6c9f7c858c142856c13c36e4eaa4e581765bc099e8a893deceb3cbb1b"} Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.501829 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrgxg" event={"ID":"892c65e3-cd5b-4c52-8b65-35205b02e26b","Type":"ContainerStarted","Data":"4c75d5aa44fc38fa8763dfc7e9c167be0e937266b50333eb5ff0263c114ccbc3"} Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.506245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7355-account-create-2vzv7" event={"ID":"98072d93-5346-4958-9c03-136c784abc74","Type":"ContainerStarted","Data":"e2113b1afe5b70e17f808ec03a9fb280c36d905d6734ed9b35522d5b29dcf520"} Nov 24 12:34:24 crc kubenswrapper[4752]: I1124 12:34:24.506556 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7355-account-create-2vzv7" 
event={"ID":"98072d93-5346-4958-9c03-136c784abc74","Type":"ContainerStarted","Data":"f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b"} Nov 24 12:34:25 crc kubenswrapper[4752]: I1124 12:34:25.519570 4752 generic.go:334] "Generic (PLEG): container finished" podID="98072d93-5346-4958-9c03-136c784abc74" containerID="e2113b1afe5b70e17f808ec03a9fb280c36d905d6734ed9b35522d5b29dcf520" exitCode=0 Nov 24 12:34:25 crc kubenswrapper[4752]: I1124 12:34:25.519667 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7355-account-create-2vzv7" event={"ID":"98072d93-5346-4958-9c03-136c784abc74","Type":"ContainerDied","Data":"e2113b1afe5b70e17f808ec03a9fb280c36d905d6734ed9b35522d5b29dcf520"} Nov 24 12:34:25 crc kubenswrapper[4752]: I1124 12:34:25.957198 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.056654 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77hgd\" (UniqueName: \"kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd\") pod \"892c65e3-cd5b-4c52-8b65-35205b02e26b\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.056851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts\") pod \"892c65e3-cd5b-4c52-8b65-35205b02e26b\" (UID: \"892c65e3-cd5b-4c52-8b65-35205b02e26b\") " Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.057420 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "892c65e3-cd5b-4c52-8b65-35205b02e26b" (UID: "892c65e3-cd5b-4c52-8b65-35205b02e26b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.062953 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd" (OuterVolumeSpecName: "kube-api-access-77hgd") pod "892c65e3-cd5b-4c52-8b65-35205b02e26b" (UID: "892c65e3-cd5b-4c52-8b65-35205b02e26b"). InnerVolumeSpecName "kube-api-access-77hgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.159399 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892c65e3-cd5b-4c52-8b65-35205b02e26b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.159447 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77hgd\" (UniqueName: \"kubernetes.io/projected/892c65e3-cd5b-4c52-8b65-35205b02e26b-kube-api-access-77hgd\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.533496 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrgxg" event={"ID":"892c65e3-cd5b-4c52-8b65-35205b02e26b","Type":"ContainerDied","Data":"4c75d5aa44fc38fa8763dfc7e9c167be0e937266b50333eb5ff0263c114ccbc3"} Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.533557 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c75d5aa44fc38fa8763dfc7e9c167be0e937266b50333eb5ff0263c114ccbc3" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.533523 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrgxg" Nov 24 12:34:26 crc kubenswrapper[4752]: I1124 12:34:26.982444 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.074247 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts\") pod \"98072d93-5346-4958-9c03-136c784abc74\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.074898 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxrbs\" (UniqueName: \"kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs\") pod \"98072d93-5346-4958-9c03-136c784abc74\" (UID: \"98072d93-5346-4958-9c03-136c784abc74\") " Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.074925 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98072d93-5346-4958-9c03-136c784abc74" (UID: "98072d93-5346-4958-9c03-136c784abc74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.076029 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98072d93-5346-4958-9c03-136c784abc74-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.079720 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs" (OuterVolumeSpecName: "kube-api-access-kxrbs") pod "98072d93-5346-4958-9c03-136c784abc74" (UID: "98072d93-5346-4958-9c03-136c784abc74"). InnerVolumeSpecName "kube-api-access-kxrbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.177794 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxrbs\" (UniqueName: \"kubernetes.io/projected/98072d93-5346-4958-9c03-136c784abc74-kube-api-access-kxrbs\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.547350 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7355-account-create-2vzv7" event={"ID":"98072d93-5346-4958-9c03-136c784abc74","Type":"ContainerDied","Data":"f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b"} Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.547410 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ddb7ee136b0c2de4b6c2f0cb98aea324d3d799bf1b608bbc65168211a88f2b" Nov 24 12:34:27 crc kubenswrapper[4752]: I1124 12:34:27.548541 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7355-account-create-2vzv7" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.784488 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"] Nov 24 12:34:28 crc kubenswrapper[4752]: E1124 12:34:28.785326 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98072d93-5346-4958-9c03-136c784abc74" containerName="mariadb-account-create" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.785337 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98072d93-5346-4958-9c03-136c784abc74" containerName="mariadb-account-create" Nov 24 12:34:28 crc kubenswrapper[4752]: E1124 12:34:28.785356 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892c65e3-cd5b-4c52-8b65-35205b02e26b" containerName="mariadb-database-create" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.785362 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="892c65e3-cd5b-4c52-8b65-35205b02e26b" containerName="mariadb-database-create" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.785520 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="98072d93-5346-4958-9c03-136c784abc74" containerName="mariadb-account-create" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.785547 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="892c65e3-cd5b-4c52-8b65-35205b02e26b" containerName="mariadb-database-create" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.786448 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.798398 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-kqdcp"] Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.800174 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.803707 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.803911 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.804081 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g687n" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.811083 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"] Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.825213 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kqdcp"] Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914514 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914672 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klkg6\" (UniqueName: \"kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914796 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914955 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bl9\" (UniqueName: \"kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.914985 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.915031 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.915053 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:28 crc kubenswrapper[4752]: I1124 12:34:28.915143 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016511 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016574 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016600 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016642 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klkg6\" (UniqueName: \"kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016662 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " 
pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bl9\" (UniqueName: \"kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016724 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016760 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.016779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.017515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.018350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.019476 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.019548 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.019773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.020998 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.023067 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.032501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.037512 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bl9\" (UniqueName: \"kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9\") pod \"placement-db-sync-kqdcp\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.049016 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klkg6\" (UniqueName: \"kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6\") pod \"dnsmasq-dns-6977c95bf9-rmg2d\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") " pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.119118 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.126267 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.636595 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kqdcp"] Nov 24 12:34:29 crc kubenswrapper[4752]: W1124 12:34:29.637135 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f88ca20_3284_434f_a7a6_13e5d3328574.slice/crio-4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77 WatchSource:0}: Error finding container 4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77: Status 404 returned error can't find the container with id 4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77 Nov 24 12:34:29 crc kubenswrapper[4752]: I1124 12:34:29.709696 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"] Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.593379 4752 generic.go:334] "Generic (PLEG): container finished" podID="33172046-c564-4757-a04a-42174d21426c" containerID="e899bb6478cfa04d7c3caf6d3cec520420f6d48cb29c605cb5b697b2f21cd1f2" exitCode=0 Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.593480 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" event={"ID":"33172046-c564-4757-a04a-42174d21426c","Type":"ContainerDied","Data":"e899bb6478cfa04d7c3caf6d3cec520420f6d48cb29c605cb5b697b2f21cd1f2"} Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.594033 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" event={"ID":"33172046-c564-4757-a04a-42174d21426c","Type":"ContainerStarted","Data":"a1ca51719c068077c94361ec2a94c59339f440e248000a6c4d2426b0c2e52d84"} Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.596902 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kqdcp" event={"ID":"1f88ca20-3284-434f-a7a6-13e5d3328574","Type":"ContainerStarted","Data":"b3f0df645f5df99584d6875624dede4a507f863fa3d16229f78dd8e5eda2060e"} Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.596970 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kqdcp" event={"ID":"1f88ca20-3284-434f-a7a6-13e5d3328574","Type":"ContainerStarted","Data":"4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77"} Nov 24 12:34:30 crc kubenswrapper[4752]: I1124 12:34:30.659720 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-kqdcp" podStartSLOduration=2.659698531 podStartE2EDuration="2.659698531s" podCreationTimestamp="2025-11-24 12:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:34:30.655120529 +0000 UTC m=+5276.639940858" watchObservedRunningTime="2025-11-24 12:34:30.659698531 +0000 UTC m=+5276.644518830" Nov 24 12:34:31 crc kubenswrapper[4752]: I1124 12:34:31.609422 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" event={"ID":"33172046-c564-4757-a04a-42174d21426c","Type":"ContainerStarted","Data":"7790afb5b21cf87da44155ed6448b08407c798a3c86b7c505eac128c7f84972e"} Nov 24 12:34:31 crc kubenswrapper[4752]: I1124 12:34:31.610030 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:31 crc kubenswrapper[4752]: I1124 12:34:31.611282 
4752 generic.go:334] "Generic (PLEG): container finished" podID="1f88ca20-3284-434f-a7a6-13e5d3328574" containerID="b3f0df645f5df99584d6875624dede4a507f863fa3d16229f78dd8e5eda2060e" exitCode=0 Nov 24 12:34:31 crc kubenswrapper[4752]: I1124 12:34:31.611401 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kqdcp" event={"ID":"1f88ca20-3284-434f-a7a6-13e5d3328574","Type":"ContainerDied","Data":"b3f0df645f5df99584d6875624dede4a507f863fa3d16229f78dd8e5eda2060e"} Nov 24 12:34:31 crc kubenswrapper[4752]: I1124 12:34:31.635738 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" podStartSLOduration=3.635715911 podStartE2EDuration="3.635715911s" podCreationTimestamp="2025-11-24 12:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:34:31.633306832 +0000 UTC m=+5277.618127161" watchObservedRunningTime="2025-11-24 12:34:31.635715911 +0000 UTC m=+5277.620536210" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.094249 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.213847 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bl9\" (UniqueName: \"kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9\") pod \"1f88ca20-3284-434f-a7a6-13e5d3328574\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.214467 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs\") pod \"1f88ca20-3284-434f-a7a6-13e5d3328574\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.214799 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs" (OuterVolumeSpecName: "logs") pod "1f88ca20-3284-434f-a7a6-13e5d3328574" (UID: "1f88ca20-3284-434f-a7a6-13e5d3328574"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.215219 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data\") pod \"1f88ca20-3284-434f-a7a6-13e5d3328574\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.215445 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts\") pod \"1f88ca20-3284-434f-a7a6-13e5d3328574\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.215621 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle\") pod \"1f88ca20-3284-434f-a7a6-13e5d3328574\" (UID: \"1f88ca20-3284-434f-a7a6-13e5d3328574\") " Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.216296 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f88ca20-3284-434f-a7a6-13e5d3328574-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.221698 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9" (OuterVolumeSpecName: "kube-api-access-l5bl9") pod "1f88ca20-3284-434f-a7a6-13e5d3328574" (UID: "1f88ca20-3284-434f-a7a6-13e5d3328574"). InnerVolumeSpecName "kube-api-access-l5bl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.229223 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts" (OuterVolumeSpecName: "scripts") pod "1f88ca20-3284-434f-a7a6-13e5d3328574" (UID: "1f88ca20-3284-434f-a7a6-13e5d3328574"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.259625 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data" (OuterVolumeSpecName: "config-data") pod "1f88ca20-3284-434f-a7a6-13e5d3328574" (UID: "1f88ca20-3284-434f-a7a6-13e5d3328574"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.262375 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f88ca20-3284-434f-a7a6-13e5d3328574" (UID: "1f88ca20-3284-434f-a7a6-13e5d3328574"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.317725 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.318194 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.318219 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bl9\" (UniqueName: \"kubernetes.io/projected/1f88ca20-3284-434f-a7a6-13e5d3328574-kube-api-access-l5bl9\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.318238 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f88ca20-3284-434f-a7a6-13e5d3328574-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.637355 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kqdcp" event={"ID":"1f88ca20-3284-434f-a7a6-13e5d3328574","Type":"ContainerDied","Data":"4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77"} Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.637622 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6cc1c0e421dbe1dd0d77f8853b50cd74103547b9ab3fa312a08adc244ddd77" Nov 24 12:34:33 crc kubenswrapper[4752]: I1124 12:34:33.637423 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kqdcp" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.212982 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-dd66498d8-m7grv"] Nov 24 12:34:34 crc kubenswrapper[4752]: E1124 12:34:34.213578 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f88ca20-3284-434f-a7a6-13e5d3328574" containerName="placement-db-sync" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.213597 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f88ca20-3284-434f-a7a6-13e5d3328574" containerName="placement-db-sync" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.213887 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f88ca20-3284-434f-a7a6-13e5d3328574" containerName="placement-db-sync" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.215444 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.217628 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.218852 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.220518 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g687n" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.248451 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd66498d8-m7grv"] Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.344578 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r6mw\" (UniqueName: \"kubernetes.io/projected/403b2bfd-19de-475e-8460-3d42506d6a5e-kube-api-access-2r6mw\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.344671 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/403b2bfd-19de-475e-8460-3d42506d6a5e-logs\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.344694 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-combined-ca-bundle\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.344715 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-config-data\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.344822 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-scripts\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.446244 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/403b2bfd-19de-475e-8460-3d42506d6a5e-logs\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.446286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-combined-ca-bundle\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.446307 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-config-data\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.446386 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-scripts\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.446430 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r6mw\" (UniqueName: \"kubernetes.io/projected/403b2bfd-19de-475e-8460-3d42506d6a5e-kube-api-access-2r6mw\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.447876 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/403b2bfd-19de-475e-8460-3d42506d6a5e-logs\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.462450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-scripts\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.489601 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-combined-ca-bundle\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.489846 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403b2bfd-19de-475e-8460-3d42506d6a5e-config-data\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.493264 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r6mw\" (UniqueName: \"kubernetes.io/projected/403b2bfd-19de-475e-8460-3d42506d6a5e-kube-api-access-2r6mw\") pod \"placement-dd66498d8-m7grv\" (UID: \"403b2bfd-19de-475e-8460-3d42506d6a5e\") " pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.553516 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:34 crc kubenswrapper[4752]: I1124 12:34:34.879304 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd66498d8-m7grv"] Nov 24 12:34:35 crc kubenswrapper[4752]: I1124 12:34:35.660707 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd66498d8-m7grv" event={"ID":"403b2bfd-19de-475e-8460-3d42506d6a5e","Type":"ContainerStarted","Data":"0b6299ac1feee1fe6acfa82eb6e6a2f1714253c695f1c37089732c8f3934ff6c"} Nov 24 12:34:35 crc kubenswrapper[4752]: I1124 12:34:35.661392 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:35 crc kubenswrapper[4752]: I1124 12:34:35.661425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd66498d8-m7grv" event={"ID":"403b2bfd-19de-475e-8460-3d42506d6a5e","Type":"ContainerStarted","Data":"a440267019ea6a0d64253f3906185385292730a033b280f9a88c1de6163d33dd"} Nov 24 12:34:35 crc kubenswrapper[4752]: I1124 12:34:35.661449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd66498d8-m7grv" event={"ID":"403b2bfd-19de-475e-8460-3d42506d6a5e","Type":"ContainerStarted","Data":"4ef481d2203e527e759236f006c845899b3205536552065cb2b8a55be126e310"} Nov 24 12:34:35 crc kubenswrapper[4752]: I1124 12:34:35.694507 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-dd66498d8-m7grv" podStartSLOduration=1.694471068 podStartE2EDuration="1.694471068s" podCreationTimestamp="2025-11-24 12:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:34:35.685376487 +0000 UTC m=+5281.670196816" watchObservedRunningTime="2025-11-24 12:34:35.694471068 +0000 UTC m=+5281.679291397" Nov 24 12:34:36 crc kubenswrapper[4752]: I1124 12:34:36.674306 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.121022 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.264314 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"] Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.265080 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="dnsmasq-dns" containerID="cri-o://1a088eb537ab104cac4db7a20cb40f46d9026ba7bb6913a90cacd74d803f2798" gracePeriod=10 Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.707965 4752 generic.go:334] "Generic (PLEG): container finished" podID="27add03f-da4d-4050-9b42-6bc8979311c4" containerID="1a088eb537ab104cac4db7a20cb40f46d9026ba7bb6913a90cacd74d803f2798" exitCode=0 Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.708049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" event={"ID":"27add03f-da4d-4050-9b42-6bc8979311c4","Type":"ContainerDied","Data":"1a088eb537ab104cac4db7a20cb40f46d9026ba7bb6913a90cacd74d803f2798"} Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.708281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" 
event={"ID":"27add03f-da4d-4050-9b42-6bc8979311c4","Type":"ContainerDied","Data":"760e7a6408aad2752b3e6e60ce02243e679db0cd8f15448a29d47a727dc5dfcd"} Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.708296 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760e7a6408aad2752b3e6e60ce02243e679db0cd8f15448a29d47a727dc5dfcd" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.735862 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.871329 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dswm9\" (UniqueName: \"kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9\") pod \"27add03f-da4d-4050-9b42-6bc8979311c4\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.871769 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb\") pod \"27add03f-da4d-4050-9b42-6bc8979311c4\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.871911 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config\") pod \"27add03f-da4d-4050-9b42-6bc8979311c4\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.871954 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb\") pod \"27add03f-da4d-4050-9b42-6bc8979311c4\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.872077 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc\") pod \"27add03f-da4d-4050-9b42-6bc8979311c4\" (UID: \"27add03f-da4d-4050-9b42-6bc8979311c4\") " Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.878821 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9" (OuterVolumeSpecName: "kube-api-access-dswm9") pod "27add03f-da4d-4050-9b42-6bc8979311c4" (UID: "27add03f-da4d-4050-9b42-6bc8979311c4"). InnerVolumeSpecName "kube-api-access-dswm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.918046 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27add03f-da4d-4050-9b42-6bc8979311c4" (UID: "27add03f-da4d-4050-9b42-6bc8979311c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.921039 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27add03f-da4d-4050-9b42-6bc8979311c4" (UID: "27add03f-da4d-4050-9b42-6bc8979311c4"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.924030 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config" (OuterVolumeSpecName: "config") pod "27add03f-da4d-4050-9b42-6bc8979311c4" (UID: "27add03f-da4d-4050-9b42-6bc8979311c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.928370 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27add03f-da4d-4050-9b42-6bc8979311c4" (UID: "27add03f-da4d-4050-9b42-6bc8979311c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.974561 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.974611 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.974631 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.974651 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dswm9\" (UniqueName: \"kubernetes.io/projected/27add03f-da4d-4050-9b42-6bc8979311c4-kube-api-access-dswm9\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:39 crc kubenswrapper[4752]: I1124 12:34:39.974669 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27add03f-da4d-4050-9b42-6bc8979311c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:34:40 crc kubenswrapper[4752]: I1124 12:34:40.720912 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c8c4d4d9c-pncnc" Nov 24 12:34:40 crc kubenswrapper[4752]: I1124 12:34:40.770560 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"] Nov 24 12:34:40 crc kubenswrapper[4752]: I1124 12:34:40.777647 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c8c4d4d9c-pncnc"] Nov 24 12:34:42 crc kubenswrapper[4752]: I1124 12:34:42.581156 4752 scope.go:117] "RemoveContainer" containerID="c8dd2495f532a5a91797f6fcf7f93ffba663cf328ca6507aadf35d76895744b6" Nov 24 12:34:42 crc kubenswrapper[4752]: I1124 12:34:42.742719 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" path="/var/lib/kubelet/pods/27add03f-da4d-4050-9b42-6bc8979311c4/volumes" Nov 24 12:35:05 crc kubenswrapper[4752]: I1124 12:35:05.589926 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:35:05 crc kubenswrapper[4752]: I1124 12:35:05.590611 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-dd66498d8-m7grv" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.500857 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mlbcr"] Nov 24 12:35:26 crc kubenswrapper[4752]: E1124 12:35:26.502022 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="init" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.502042 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="init" Nov 24 12:35:26 crc kubenswrapper[4752]: E1124 12:35:26.502069 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="dnsmasq-dns" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.502077 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="dnsmasq-dns" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.502332 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="27add03f-da4d-4050-9b42-6bc8979311c4" containerName="dnsmasq-dns" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.503093 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.523957 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mlbcr"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.573869 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f54k9"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.574899 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.590666 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f54k9"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.655673 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55c4g\" (UniqueName: \"kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.656468 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.691009 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fb8b-account-create-8dh5x"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.715188 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fb8b-account-create-8dh5x"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.715313 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.722783 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.761041 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcjd\" (UniqueName: \"kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.761102 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55c4g\" (UniqueName: \"kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.761142 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.761193 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.762633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.788336 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dbdgt"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.790220 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.794424 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55c4g\" (UniqueName: \"kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g\") pod \"nova-api-db-create-mlbcr\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.812184 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dbdgt"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.821839 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.862258 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htcjd\" (UniqueName: \"kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.862315 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.862338 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvz6\" (UniqueName: \"kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.862362 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.864025 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.887211 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcjd\" (UniqueName: 
\"kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd\") pod \"nova-cell0-db-create-f54k9\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.891980 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.895830 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-43c9-account-create-8qtsg"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.897332 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.901319 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-43c9-account-create-8qtsg"] Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.903306 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.963480 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.963840 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvz6\" (UniqueName: \"kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.964220 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.964249 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnsb7\" (UniqueName: \"kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.965799 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:26 crc kubenswrapper[4752]: I1124 12:35:26.983655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvz6\" (UniqueName: \"kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6\") pod \"nova-api-fb8b-account-create-8dh5x\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 
12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.031789 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.065899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.065947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.065976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnsb7\" (UniqueName: \"kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.066089 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdpp\" (UniqueName: \"kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.066625 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.090167 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnsb7\" (UniqueName: \"kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7\") pod \"nova-cell1-db-create-dbdgt\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.090436 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2ea2-account-create-wzkx2"] Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.091853 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.093567 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.106827 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2ea2-account-create-wzkx2"] Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.167852 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdpp\" (UniqueName: \"kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.167938 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.168780 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.191503 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdpp\" (UniqueName: \"kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp\") pod \"nova-cell0-43c9-account-create-8qtsg\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.269025 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmm6\" (UniqueName: \"kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.269309 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.294888 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.307564 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mlbcr"] Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.307848 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.370348 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmm6\" (UniqueName: \"kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.370406 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.371317 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.389255 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmm6\" (UniqueName: \"kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6\") pod \"nova-cell1-2ea2-account-create-wzkx2\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.410651 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.420402 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f54k9"] Nov 24 12:35:27 crc kubenswrapper[4752]: W1124 12:35:27.443459 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ebef79_47d8_4a69_a075_ce63622d42c4.slice/crio-7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f WatchSource:0}: Error finding container 7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f: Status 404 returned error can't find the container with id 7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.479716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mlbcr" event={"ID":"1be35493-c5d0-476a-ae26-42485af9efe5","Type":"ContainerStarted","Data":"eb109a8c3dda3d366888ede4c8cbec2b930d817e0e58f3503c30308f7e2c4339"} Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.482159 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f54k9" event={"ID":"06ebef79-47d8-4a69-a075-ce63622d42c4","Type":"ContainerStarted","Data":"7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f"} Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.514470 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fb8b-account-create-8dh5x"] Nov 24 12:35:27 crc kubenswrapper[4752]: W1124 12:35:27.538728 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26a41aac_1ab9_4a64_b65c_1c45ea19c56b.slice/crio-e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0 WatchSource:0}: Error finding container e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0: Status 404 returned error can't find the container with id e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0 Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.837246 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dbdgt"] Nov 24 12:35:27 crc kubenswrapper[4752]: W1124 12:35:27.842429 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55251fd6_dac5_445f_99cb_3a600e94217c.slice/crio-08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f WatchSource:0}: Error finding container 08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f: Status 404 returned error can't find the container with id 08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f Nov 24 12:35:27 crc kubenswrapper[4752]: I1124 12:35:27.870094 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-43c9-account-create-8qtsg"] Nov 24 12:35:27 crc kubenswrapper[4752]: W1124 12:35:27.880194 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc58d145a_cf82_4d24_acea_034d8e2b5f6a.slice/crio-fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8 WatchSource:0}: Error finding container fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8: Status 404 returned error can't find the container with id fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8 Nov 24 12:35:27 crc 
kubenswrapper[4752]: I1124 12:35:27.943825 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2ea2-account-create-wzkx2"] Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.501096 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dbdgt" event={"ID":"55251fd6-dac5-445f-99cb-3a600e94217c","Type":"ContainerStarted","Data":"6e75e69472046419ee54bd5401569b2d91a312a57f044380163361e9ba16a0b6"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.501139 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dbdgt" event={"ID":"55251fd6-dac5-445f-99cb-3a600e94217c","Type":"ContainerStarted","Data":"08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.506564 4752 generic.go:334] "Generic (PLEG): container finished" podID="06ebef79-47d8-4a69-a075-ce63622d42c4" containerID="7721169ae85727d9a39cd13e0a925e97b570f57d194841e1265f55da8b66aefb" exitCode=0 Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.506622 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f54k9" event={"ID":"06ebef79-47d8-4a69-a075-ce63622d42c4","Type":"ContainerDied","Data":"7721169ae85727d9a39cd13e0a925e97b570f57d194841e1265f55da8b66aefb"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.508473 4752 generic.go:334] "Generic (PLEG): container finished" podID="26a41aac-1ab9-4a64-b65c-1c45ea19c56b" containerID="a6c4db08d5ed908e7b36eb39e1962f11529ba79d0e4961289bb5fc5026200c06" exitCode=0 Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.508509 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fb8b-account-create-8dh5x" event={"ID":"26a41aac-1ab9-4a64-b65c-1c45ea19c56b","Type":"ContainerDied","Data":"a6c4db08d5ed908e7b36eb39e1962f11529ba79d0e4961289bb5fc5026200c06"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.508525 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fb8b-account-create-8dh5x" event={"ID":"26a41aac-1ab9-4a64-b65c-1c45ea19c56b","Type":"ContainerStarted","Data":"e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.510328 4752 generic.go:334] "Generic (PLEG): container finished" podID="c58d145a-cf82-4d24-acea-034d8e2b5f6a" containerID="5c3ca94672a7e945e47f1589c91eda4ea922c0027b0eece350599aef423112f3" exitCode=0 Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.510374 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43c9-account-create-8qtsg" event={"ID":"c58d145a-cf82-4d24-acea-034d8e2b5f6a","Type":"ContainerDied","Data":"5c3ca94672a7e945e47f1589c91eda4ea922c0027b0eece350599aef423112f3"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.510390 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43c9-account-create-8qtsg" event={"ID":"c58d145a-cf82-4d24-acea-034d8e2b5f6a","Type":"ContainerStarted","Data":"fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.511543 4752 generic.go:334] "Generic (PLEG): container finished" podID="44914755-229b-4591-8098-84982a49fdbc" containerID="70242a0e997c03dadde82918af5df61024dd5115f3af6886478123f5e51f8ccd" exitCode=0 Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.511584 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-2ea2-account-create-wzkx2" event={"ID":"44914755-229b-4591-8098-84982a49fdbc","Type":"ContainerDied","Data":"70242a0e997c03dadde82918af5df61024dd5115f3af6886478123f5e51f8ccd"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.511600 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" event={"ID":"44914755-229b-4591-8098-84982a49fdbc","Type":"ContainerStarted","Data":"344806917f03f74de519a5059fad1c367320b2b9c588b81009fc432a8cf67b37"} Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.512550 4752 generic.go:334] "Generic (PLEG): container finished" podID="1be35493-c5d0-476a-ae26-42485af9efe5" containerID="7ee1badcca9a9830941349e81895cdae778096336dfc507f10562388a478a9a8" exitCode=0 Nov 24 12:35:28 crc kubenswrapper[4752]: I1124 12:35:28.512576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mlbcr" event={"ID":"1be35493-c5d0-476a-ae26-42485af9efe5","Type":"ContainerDied","Data":"7ee1badcca9a9830941349e81895cdae778096336dfc507f10562388a478a9a8"} Nov 24 12:35:29 crc kubenswrapper[4752]: I1124 12:35:29.527645 4752 generic.go:334] "Generic (PLEG): container finished" podID="55251fd6-dac5-445f-99cb-3a600e94217c" containerID="6e75e69472046419ee54bd5401569b2d91a312a57f044380163361e9ba16a0b6" exitCode=0 Nov 24 12:35:29 crc kubenswrapper[4752]: I1124 12:35:29.527788 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dbdgt" event={"ID":"55251fd6-dac5-445f-99cb-3a600e94217c","Type":"ContainerDied","Data":"6e75e69472046419ee54bd5401569b2d91a312a57f044380163361e9ba16a0b6"} Nov 24 12:35:29 crc kubenswrapper[4752]: I1124 12:35:29.927179 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.025677 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts\") pod \"06ebef79-47d8-4a69-a075-ce63622d42c4\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.025873 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htcjd\" (UniqueName: \"kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd\") pod \"06ebef79-47d8-4a69-a075-ce63622d42c4\" (UID: \"06ebef79-47d8-4a69-a075-ce63622d42c4\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.026266 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06ebef79-47d8-4a69-a075-ce63622d42c4" (UID: "06ebef79-47d8-4a69-a075-ce63622d42c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.031451 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd" (OuterVolumeSpecName: "kube-api-access-htcjd") pod "06ebef79-47d8-4a69-a075-ce63622d42c4" (UID: "06ebef79-47d8-4a69-a075-ce63622d42c4"). InnerVolumeSpecName "kube-api-access-htcjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.119161 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.128815 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.129321 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htcjd\" (UniqueName: \"kubernetes.io/projected/06ebef79-47d8-4a69-a075-ce63622d42c4-kube-api-access-htcjd\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.129351 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ebef79-47d8-4a69-a075-ce63622d42c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.145319 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.147567 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.156686 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.230946 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmm6\" (UniqueName: \"kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6\") pod \"44914755-229b-4591-8098-84982a49fdbc\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231211 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts\") pod \"1be35493-c5d0-476a-ae26-42485af9efe5\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231362 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55c4g\" (UniqueName: \"kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g\") pod \"1be35493-c5d0-476a-ae26-42485af9efe5\" (UID: \"1be35493-c5d0-476a-ae26-42485af9efe5\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zdpp\" (UniqueName: \"kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp\") pod \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts\") pod \"55251fd6-dac5-445f-99cb-3a600e94217c\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231771 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mfvz6\" (UniqueName: \"kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6\") pod \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.231937 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnsb7\" (UniqueName: \"kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7\") pod \"55251fd6-dac5-445f-99cb-3a600e94217c\" (UID: \"55251fd6-dac5-445f-99cb-3a600e94217c\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.232119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts\") pod \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\" (UID: \"c58d145a-cf82-4d24-acea-034d8e2b5f6a\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.232229 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts\") pod \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\" (UID: \"26a41aac-1ab9-4a64-b65c-1c45ea19c56b\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.232358 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts\") pod \"44914755-229b-4591-8098-84982a49fdbc\" (UID: \"44914755-229b-4591-8098-84982a49fdbc\") " Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.233319 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44914755-229b-4591-8098-84982a49fdbc" (UID: "44914755-229b-4591-8098-84982a49fdbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.233984 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1be35493-c5d0-476a-ae26-42485af9efe5" (UID: "1be35493-c5d0-476a-ae26-42485af9efe5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.234293 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6" (OuterVolumeSpecName: "kube-api-access-jwmm6") pod "44914755-229b-4591-8098-84982a49fdbc" (UID: "44914755-229b-4591-8098-84982a49fdbc"). InnerVolumeSpecName "kube-api-access-jwmm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.235094 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26a41aac-1ab9-4a64-b65c-1c45ea19c56b" (UID: "26a41aac-1ab9-4a64-b65c-1c45ea19c56b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.235192 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55251fd6-dac5-445f-99cb-3a600e94217c" (UID: "55251fd6-dac5-445f-99cb-3a600e94217c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.235937 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c58d145a-cf82-4d24-acea-034d8e2b5f6a" (UID: "c58d145a-cf82-4d24-acea-034d8e2b5f6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.237443 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g" (OuterVolumeSpecName: "kube-api-access-55c4g") pod "1be35493-c5d0-476a-ae26-42485af9efe5" (UID: "1be35493-c5d0-476a-ae26-42485af9efe5"). InnerVolumeSpecName "kube-api-access-55c4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.237519 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp" (OuterVolumeSpecName: "kube-api-access-7zdpp") pod "c58d145a-cf82-4d24-acea-034d8e2b5f6a" (UID: "c58d145a-cf82-4d24-acea-034d8e2b5f6a"). InnerVolumeSpecName "kube-api-access-7zdpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.238074 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7" (OuterVolumeSpecName: "kube-api-access-lnsb7") pod "55251fd6-dac5-445f-99cb-3a600e94217c" (UID: "55251fd6-dac5-445f-99cb-3a600e94217c"). InnerVolumeSpecName "kube-api-access-lnsb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.240054 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6" (OuterVolumeSpecName: "kube-api-access-mfvz6") pod "26a41aac-1ab9-4a64-b65c-1c45ea19c56b" (UID: "26a41aac-1ab9-4a64-b65c-1c45ea19c56b"). InnerVolumeSpecName "kube-api-access-mfvz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334273 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwmm6\" (UniqueName: \"kubernetes.io/projected/44914755-229b-4591-8098-84982a49fdbc-kube-api-access-jwmm6\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334311 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1be35493-c5d0-476a-ae26-42485af9efe5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334321 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55c4g\" (UniqueName: \"kubernetes.io/projected/1be35493-c5d0-476a-ae26-42485af9efe5-kube-api-access-55c4g\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334329 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zdpp\" (UniqueName: \"kubernetes.io/projected/c58d145a-cf82-4d24-acea-034d8e2b5f6a-kube-api-access-7zdpp\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334338 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55251fd6-dac5-445f-99cb-3a600e94217c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334347 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvz6\" (UniqueName: \"kubernetes.io/projected/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-kube-api-access-mfvz6\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334357 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnsb7\" (UniqueName: \"kubernetes.io/projected/55251fd6-dac5-445f-99cb-3a600e94217c-kube-api-access-lnsb7\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334366 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c58d145a-cf82-4d24-acea-034d8e2b5f6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334375 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26a41aac-1ab9-4a64-b65c-1c45ea19c56b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.334384 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44914755-229b-4591-8098-84982a49fdbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.550055 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fb8b-account-create-8dh5x" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.550000 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fb8b-account-create-8dh5x" event={"ID":"26a41aac-1ab9-4a64-b65c-1c45ea19c56b","Type":"ContainerDied","Data":"e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.552292 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e03c74c50210a1eb0afd9ef7d51699b74fdf120d6c0515114b2304f5a96ba1e0" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.553537 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-43c9-account-create-8qtsg" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.553827 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43c9-account-create-8qtsg" event={"ID":"c58d145a-cf82-4d24-acea-034d8e2b5f6a","Type":"ContainerDied","Data":"fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.553893 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda4b93c2c0dbb826f2a987d12151a2dc0a15573ac55e2d33165a5913dbddfc8" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.556575 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" event={"ID":"44914755-229b-4591-8098-84982a49fdbc","Type":"ContainerDied","Data":"344806917f03f74de519a5059fad1c367320b2b9c588b81009fc432a8cf67b37"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.556632 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344806917f03f74de519a5059fad1c367320b2b9c588b81009fc432a8cf67b37" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.556669 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2ea2-account-create-wzkx2" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.558926 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mlbcr" event={"ID":"1be35493-c5d0-476a-ae26-42485af9efe5","Type":"ContainerDied","Data":"eb109a8c3dda3d366888ede4c8cbec2b930d817e0e58f3503c30308f7e2c4339"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.558980 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb109a8c3dda3d366888ede4c8cbec2b930d817e0e58f3503c30308f7e2c4339" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.559060 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mlbcr" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.570585 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dbdgt" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.570581 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dbdgt" event={"ID":"55251fd6-dac5-445f-99cb-3a600e94217c","Type":"ContainerDied","Data":"08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.570887 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08b1d885ce6c71401d1591b40a26f10878746c7ccb115e00038ad2cf12b9387f" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.573723 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f54k9" event={"ID":"06ebef79-47d8-4a69-a075-ce63622d42c4","Type":"ContainerDied","Data":"7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f"} Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.574021 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7adc0e8c80ac449db63202bfc37b612b577fc0baa29f03208096af6f443f250f" Nov 24 12:35:30 crc kubenswrapper[4752]: I1124 12:35:30.573857 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f54k9" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136020 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrng"] Nov 24 12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136715 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a41aac-1ab9-4a64-b65c-1c45ea19c56b" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136731 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a41aac-1ab9-4a64-b65c-1c45ea19c56b" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136784 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44914755-229b-4591-8098-84982a49fdbc" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136793 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="44914755-229b-4591-8098-84982a49fdbc" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136815 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55251fd6-dac5-445f-99cb-3a600e94217c" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136824 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55251fd6-dac5-445f-99cb-3a600e94217c" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136836 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ebef79-47d8-4a69-a075-ce63622d42c4" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136845 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ebef79-47d8-4a69-a075-ce63622d42c4" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136863 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be35493-c5d0-476a-ae26-42485af9efe5" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136871 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be35493-c5d0-476a-ae26-42485af9efe5" containerName="mariadb-database-create" Nov 24 
12:35:32 crc kubenswrapper[4752]: E1124 12:35:32.136883 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58d145a-cf82-4d24-acea-034d8e2b5f6a" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.136891 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58d145a-cf82-4d24-acea-034d8e2b5f6a" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137088 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be35493-c5d0-476a-ae26-42485af9efe5" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137111 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="44914755-229b-4591-8098-84982a49fdbc" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137125 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ebef79-47d8-4a69-a075-ce63622d42c4" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137140 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a41aac-1ab9-4a64-b65c-1c45ea19c56b" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137160 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55251fd6-dac5-445f-99cb-3a600e94217c" containerName="mariadb-database-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137179 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58d145a-cf82-4d24-acea-034d8e2b5f6a" containerName="mariadb-account-create" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.137905 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.140967 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.141058 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.141693 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gl7jf" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.148295 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrng"] Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.283363 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.283505 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.283649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44h5\" (UniqueName: \"kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.283688 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.386209 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.386366 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.386570 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44h5\" (UniqueName: \"kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5\") pod \"nova-cell0-conductor-db-sync-bdrng\" 
(UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.386618 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.396642 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.402999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.406639 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.418351 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44h5\" (UniqueName: \"kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5\") pod \"nova-cell0-conductor-db-sync-bdrng\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:32 crc kubenswrapper[4752]: I1124 12:35:32.459849 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:33 crc kubenswrapper[4752]: I1124 12:35:33.008212 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrng"] Nov 24 12:35:33 crc kubenswrapper[4752]: W1124 12:35:33.010726 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e91b6fb_094d_468a_af30_0e161a3e16ab.slice/crio-f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663 WatchSource:0}: Error finding container f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663: Status 404 returned error can't find the container with id f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663 Nov 24 12:35:33 crc kubenswrapper[4752]: I1124 12:35:33.614528 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrng" event={"ID":"6e91b6fb-094d-468a-af30-0e161a3e16ab","Type":"ContainerStarted","Data":"f6f1107bb8c6bdab6510656d8c2f84a7b272365cce808da81f9a844af1cc150b"} Nov 24 12:35:33 crc kubenswrapper[4752]: I1124 12:35:33.614989 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrng" event={"ID":"6e91b6fb-094d-468a-af30-0e161a3e16ab","Type":"ContainerStarted","Data":"f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663"} Nov 24 12:35:33 crc kubenswrapper[4752]: I1124 12:35:33.653969 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bdrng" podStartSLOduration=1.653938096 podStartE2EDuration="1.653938096s" podCreationTimestamp="2025-11-24 12:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:33.637184655 +0000 UTC m=+5339.622004984" watchObservedRunningTime="2025-11-24 12:35:33.653938096 +0000 UTC m=+5339.638758445" Nov 24 12:35:38 crc kubenswrapper[4752]: I1124 12:35:38.678695 4752 generic.go:334] "Generic (PLEG): container finished" podID="6e91b6fb-094d-468a-af30-0e161a3e16ab" containerID="f6f1107bb8c6bdab6510656d8c2f84a7b272365cce808da81f9a844af1cc150b" exitCode=0 Nov 24 12:35:38 crc kubenswrapper[4752]: I1124 12:35:38.678839 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrng" event={"ID":"6e91b6fb-094d-468a-af30-0e161a3e16ab","Type":"ContainerDied","Data":"f6f1107bb8c6bdab6510656d8c2f84a7b272365cce808da81f9a844af1cc150b"} Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.102804 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.205170 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle\") pod \"6e91b6fb-094d-468a-af30-0e161a3e16ab\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.205226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data\") pod \"6e91b6fb-094d-468a-af30-0e161a3e16ab\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.205274 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts\") pod \"6e91b6fb-094d-468a-af30-0e161a3e16ab\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.205302 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44h5\" (UniqueName: \"kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5\") pod \"6e91b6fb-094d-468a-af30-0e161a3e16ab\" (UID: \"6e91b6fb-094d-468a-af30-0e161a3e16ab\") " Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.214473 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts" (OuterVolumeSpecName: "scripts") pod "6e91b6fb-094d-468a-af30-0e161a3e16ab" (UID: "6e91b6fb-094d-468a-af30-0e161a3e16ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.215907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5" (OuterVolumeSpecName: "kube-api-access-g44h5") pod "6e91b6fb-094d-468a-af30-0e161a3e16ab" (UID: "6e91b6fb-094d-468a-af30-0e161a3e16ab"). InnerVolumeSpecName "kube-api-access-g44h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.237798 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e91b6fb-094d-468a-af30-0e161a3e16ab" (UID: "6e91b6fb-094d-468a-af30-0e161a3e16ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.258494 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data" (OuterVolumeSpecName: "config-data") pod "6e91b6fb-094d-468a-af30-0e161a3e16ab" (UID: "6e91b6fb-094d-468a-af30-0e161a3e16ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.309176 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.309233 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.309253 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e91b6fb-094d-468a-af30-0e161a3e16ab-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.309271 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44h5\" (UniqueName: \"kubernetes.io/projected/6e91b6fb-094d-468a-af30-0e161a3e16ab-kube-api-access-g44h5\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.708529 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bdrng" event={"ID":"6e91b6fb-094d-468a-af30-0e161a3e16ab","Type":"ContainerDied","Data":"f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663"} Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.708604 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bdrng" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.708614 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f950bb68ecbe5298aa16adf0bb1b41b38d2c871f74f10dc6718e3d52c8797663" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.815775 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:35:40 crc kubenswrapper[4752]: E1124 12:35:40.816466 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e91b6fb-094d-468a-af30-0e161a3e16ab" containerName="nova-cell0-conductor-db-sync" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.816489 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e91b6fb-094d-468a-af30-0e161a3e16ab" containerName="nova-cell0-conductor-db-sync" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.816836 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e91b6fb-094d-468a-af30-0e161a3e16ab" containerName="nova-cell0-conductor-db-sync" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.817868 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.861672 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.866542 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gl7jf" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.901479 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.924917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.925007 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb56j\" (UniqueName: \"kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:40 crc kubenswrapper[4752]: I1124 12:35:40.925036 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.026296 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.026465 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb56j\" (UniqueName: \"kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.026505 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.032239 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.033937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.058779 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb56j\" (UniqueName: \"kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j\") pod \"nova-cell0-conductor-0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.192427 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.526520 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:35:41 crc kubenswrapper[4752]: I1124 12:35:41.717032 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c82b290-1172-460e-9b64-5a73c92229d0","Type":"ContainerStarted","Data":"bf97fc942abb6428fee34e917bb698947e7d68921c35c7f8fbb0f66d1676f4b4"} Nov 24 12:35:42 crc kubenswrapper[4752]: I1124 12:35:42.748480 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:42 crc kubenswrapper[4752]: I1124 12:35:42.753198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c82b290-1172-460e-9b64-5a73c92229d0","Type":"ContainerStarted","Data":"8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d"} Nov 24 12:35:42 crc kubenswrapper[4752]: I1124 12:35:42.760937 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.760904004 podStartE2EDuration="2.760904004s" podCreationTimestamp="2025-11-24 12:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:42.755500949 +0000 UTC m=+5348.740321258" watchObservedRunningTime="2025-11-24 12:35:42.760904004 +0000 UTC m=+5348.745724343" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.243861 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.744191 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4jn2n"] Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.745438 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.747235 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.752153 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.770051 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4jn2n"] Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.838052 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mss4j\" (UniqueName: \"kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.838129 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.838190 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.838254 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.916601 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.917719 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.925362 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.939692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mss4j\" (UniqueName: \"kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.940036 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.940227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.940381 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.941825 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.948260 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.948260 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.950449 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:46 crc kubenswrapper[4752]: I1124 12:35:46.982368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mss4j\" (UniqueName: \"kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j\") pod \"nova-cell0-cell-mapping-4jn2n\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.025999 4752 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.035012 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.041591 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.041651 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.041725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc8h\" (UniqueName: \"kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.042152 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.062498 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.064367 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.069181 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.070494 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.110457 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.137814 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.143841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.143932 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.143949 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7vl\" (UniqueName: \"kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.143979 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.144001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.144031 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc8h\" (UniqueName: \"kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.159491 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.161762 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.175629 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvc8h\" (UniqueName: \"kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.234803 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.236494 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.236986 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.244255 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245209 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245245 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245308 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245331 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245352 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7vl\" (UniqueName: \"kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245383 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.245454 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2z6\" (UniqueName: 
\"kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.251738 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.252952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.264506 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.308339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7vl\" (UniqueName: \"kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl\") pod \"nova-scheduler-0\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") " pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.331889 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.333315 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349646 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349703 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349799 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2z6\" (UniqueName: \"kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349835 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349860 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " 
pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349898 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqhvw\" (UniqueName: \"kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.349967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.363916 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.369371 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.395977 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.398274 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2z6\" (UniqueName: \"kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.401139 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.407961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data\") pod \"nova-metadata-0\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") " pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.454572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzgm\" (UniqueName: \"kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.454793 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqhvw\" (UniqueName: 
\"kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.454857 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.454967 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.455286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.456685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.456710 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.456733 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.457013 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.457320 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.463612 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 
12:35:47.473381 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.474117 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqhvw\" (UniqueName: \"kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw\") pod \"nova-api-0\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") " pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.560813 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.560868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.560963 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.560999 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxzgm\" (UniqueName: \"kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.561035 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.562129 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.563124 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.563127 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.564709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.584502 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxzgm\" (UniqueName: \"kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm\") pod \"dnsmasq-dns-64bb9495b5-5vb79\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.609239 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.688307 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.708316 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.835730 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4jn2n"] Nov 24 12:35:47 crc kubenswrapper[4752]: W1124 12:35:47.859693 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f4c990_46bb_4b1a_ad4b_c7207ab5facd.slice/crio-462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3 WatchSource:0}: Error finding container 462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3: Status 404 returned error can't find the container with id 462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3 Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.936208 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: W1124 12:35:47.939050 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd244a063_29c4_4ae4_b3f3_35dd3232d55b.slice/crio-aec4a9ccc6524a446b7b7734b07746cc8534261dc0ffcb51f0c2601396c2cd81 WatchSource:0}: Error finding container aec4a9ccc6524a446b7b7734b07746cc8534261dc0ffcb51f0c2601396c2cd81: Status 404 returned error can't find the container with id aec4a9ccc6524a446b7b7734b07746cc8534261dc0ffcb51f0c2601396c2cd81 Nov 24 12:35:47 crc kubenswrapper[4752]: W1124 12:35:47.958254 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc10936c_50a4_47ec_adb9_b1751a876713.slice/crio-93e340e9644df9d5a4076f1930539aaab7297d0420be66fcab42a3c2f200fd5a WatchSource:0}: Error finding container 93e340e9644df9d5a4076f1930539aaab7297d0420be66fcab42a3c2f200fd5a: Status 404 returned error can't find the container with id 93e340e9644df9d5a4076f1930539aaab7297d0420be66fcab42a3c2f200fd5a Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.963743 4752 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.976693 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vdv2t"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.977969 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.982094 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.984621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vdv2t"] Nov 24 12:35:47 crc kubenswrapper[4752]: I1124 12:35:47.985428 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.074009 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h75c\" (UniqueName: \"kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.074064 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.074113 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.074174 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.175924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.176041 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.176111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.176218 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h75c\" (UniqueName: \"kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.182204 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.182220 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.182537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.190240 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.191292 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h75c\" (UniqueName: \"kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c\") pod \"nova-cell1-conductor-db-sync-vdv2t\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: W1124 12:35:48.265359 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe01389a_2506_46c0_9785_ecb02ade4e83.slice/crio-e974dea27bdfbabcf39f7c011c681f710f308a42ab116acd43c5452c94de407a WatchSource:0}: Error finding container e974dea27bdfbabcf39f7c011c681f710f308a42ab116acd43c5452c94de407a: Status 404 returned error can't find the container with id e974dea27bdfbabcf39f7c011c681f710f308a42ab116acd43c5452c94de407a Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.328870 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.335040 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"] Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.348471 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.816085 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4jn2n" event={"ID":"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd","Type":"ContainerStarted","Data":"75824353830c05a0521eb1cb0c25fc522e3e8198813f7e10876fe6c76ae15e02"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.816675 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4jn2n" event={"ID":"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd","Type":"ContainerStarted","Data":"462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.820186 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerStarted","Data":"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.820236 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerStarted","Data":"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.820256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerStarted","Data":"e974dea27bdfbabcf39f7c011c681f710f308a42ab116acd43c5452c94de407a"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.822597 4752 generic.go:334] "Generic (PLEG): container finished" podID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerID="9aa110f84bb7ad44b315dfb14b527200c50da4083dca798e11cd847a7883a555" exitCode=0 Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.822660 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" event={"ID":"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc","Type":"ContainerDied","Data":"9aa110f84bb7ad44b315dfb14b527200c50da4083dca798e11cd847a7883a555"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.822687 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" event={"ID":"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc","Type":"ContainerStarted","Data":"1448edd7c92627010a60ea0fab5463988f207754341bce0e5be8f3d53fb02918"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.829580 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vdv2t"] Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.830924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d244a063-29c4-4ae4-b3f3-35dd3232d55b","Type":"ContainerStarted","Data":"5edffa8ecc999bd66f216c289dd13f966726c56ab67392ec9c8f239b3f686818"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.830988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d244a063-29c4-4ae4-b3f3-35dd3232d55b","Type":"ContainerStarted","Data":"aec4a9ccc6524a446b7b7734b07746cc8534261dc0ffcb51f0c2601396c2cd81"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.836637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dc10936c-50a4-47ec-adb9-b1751a876713","Type":"ContainerStarted","Data":"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.836687 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc10936c-50a4-47ec-adb9-b1751a876713","Type":"ContainerStarted","Data":"93e340e9644df9d5a4076f1930539aaab7297d0420be66fcab42a3c2f200fd5a"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.842935 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4jn2n" podStartSLOduration=2.842914073 podStartE2EDuration="2.842914073s" podCreationTimestamp="2025-11-24 12:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:48.832407861 +0000 UTC m=+5354.817228160" watchObservedRunningTime="2025-11-24 12:35:48.842914073 +0000 UTC m=+5354.827734372" Nov 24 12:35:48 crc kubenswrapper[4752]: W1124 12:35:48.844219 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c32f7d_4d73_4876_8610_95810a5318c6.slice/crio-c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a WatchSource:0}: Error finding container c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a: Status 404 returned error can't find the container with id c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.853204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerStarted","Data":"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.853259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerStarted","Data":"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.853273 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerStarted","Data":"b3e0b833635c0cb58a4e7cb1d1236c953e415c26266d2f87af14e8a25cfb9ce8"} Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.876019 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.875993602 podStartE2EDuration="2.875993602s" podCreationTimestamp="2025-11-24 12:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:48.847377981 +0000 UTC m=+5354.832198270" watchObservedRunningTime="2025-11-24 12:35:48.875993602 +0000 UTC m=+5354.860813901" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.897769 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.897728765 podStartE2EDuration="1.897728765s" podCreationTimestamp="2025-11-24 12:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:48.890634772 +0000 UTC m=+5354.875455061" watchObservedRunningTime="2025-11-24 
12:35:48.897728765 +0000 UTC m=+5354.882549054" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.923191 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.923168155 podStartE2EDuration="2.923168155s" podCreationTimestamp="2025-11-24 12:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:48.909472752 +0000 UTC m=+5354.894293041" watchObservedRunningTime="2025-11-24 12:35:48.923168155 +0000 UTC m=+5354.907988444" Nov 24 12:35:48 crc kubenswrapper[4752]: I1124 12:35:48.944127 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.944110216 podStartE2EDuration="1.944110216s" podCreationTimestamp="2025-11-24 12:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:48.934105589 +0000 UTC m=+5354.918925888" watchObservedRunningTime="2025-11-24 12:35:48.944110216 +0000 UTC m=+5354.928930505" Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.861895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" event={"ID":"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc","Type":"ContainerStarted","Data":"e1d6d902bef69e3f81e2100e9a35eb76b7328cc8906d6836de3351750c08b218"} Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.863172 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.863420 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" event={"ID":"73c32f7d-4d73-4876-8610-95810a5318c6","Type":"ContainerStarted","Data":"cd6925524a88bfd66414e81ae9a1567d3985dfbad3b896dfb9fa0bda4ec33241"} Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.863449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" event={"ID":"73c32f7d-4d73-4876-8610-95810a5318c6","Type":"ContainerStarted","Data":"c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a"} Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.886495 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" podStartSLOduration=2.88647222 podStartE2EDuration="2.88647222s" podCreationTimestamp="2025-11-24 12:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:49.87915412 +0000 UTC m=+5355.863974409" watchObservedRunningTime="2025-11-24 12:35:49.88647222 +0000 UTC m=+5355.871292529" Nov 24 12:35:49 crc kubenswrapper[4752]: I1124 12:35:49.897849 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" podStartSLOduration=2.897824685 podStartE2EDuration="2.897824685s" podCreationTimestamp="2025-11-24 12:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:49.89553854 +0000 UTC m=+5355.880358849" watchObservedRunningTime="2025-11-24 12:35:49.897824685 +0000 UTC m=+5355.882644974" Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.238024 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.372566 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.689028 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.691664 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.906243 4752 generic.go:334] "Generic (PLEG): container finished" podID="73c32f7d-4d73-4876-8610-95810a5318c6" containerID="cd6925524a88bfd66414e81ae9a1567d3985dfbad3b896dfb9fa0bda4ec33241" exitCode=0 Nov 24 12:35:52 crc kubenswrapper[4752]: I1124 12:35:52.907812 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" event={"ID":"73c32f7d-4d73-4876-8610-95810a5318c6","Type":"ContainerDied","Data":"cd6925524a88bfd66414e81ae9a1567d3985dfbad3b896dfb9fa0bda4ec33241"} Nov 24 12:35:53 crc kubenswrapper[4752]: I1124 12:35:53.920530 4752 generic.go:334] "Generic (PLEG): container finished" podID="f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" containerID="75824353830c05a0521eb1cb0c25fc522e3e8198813f7e10876fe6c76ae15e02" exitCode=0 Nov 24 12:35:53 crc kubenswrapper[4752]: I1124 12:35:53.920806 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4jn2n" event={"ID":"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd","Type":"ContainerDied","Data":"75824353830c05a0521eb1cb0c25fc522e3e8198813f7e10876fe6c76ae15e02"} Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.323878 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.525847 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts\") pod \"73c32f7d-4d73-4876-8610-95810a5318c6\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.525951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle\") pod \"73c32f7d-4d73-4876-8610-95810a5318c6\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.526041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h75c\" (UniqueName: \"kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c\") pod \"73c32f7d-4d73-4876-8610-95810a5318c6\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.526065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data\") pod \"73c32f7d-4d73-4876-8610-95810a5318c6\" (UID: \"73c32f7d-4d73-4876-8610-95810a5318c6\") " Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.532024 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts" (OuterVolumeSpecName: "scripts") pod "73c32f7d-4d73-4876-8610-95810a5318c6" (UID: "73c32f7d-4d73-4876-8610-95810a5318c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.540906 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c" (OuterVolumeSpecName: "kube-api-access-7h75c") pod "73c32f7d-4d73-4876-8610-95810a5318c6" (UID: "73c32f7d-4d73-4876-8610-95810a5318c6"). InnerVolumeSpecName "kube-api-access-7h75c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.552566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c32f7d-4d73-4876-8610-95810a5318c6" (UID: "73c32f7d-4d73-4876-8610-95810a5318c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.557019 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data" (OuterVolumeSpecName: "config-data") pod "73c32f7d-4d73-4876-8610-95810a5318c6" (UID: "73c32f7d-4d73-4876-8610-95810a5318c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.629730 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h75c\" (UniqueName: \"kubernetes.io/projected/73c32f7d-4d73-4876-8610-95810a5318c6-kube-api-access-7h75c\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.629829 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.629851 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.629872 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c32f7d-4d73-4876-8610-95810a5318c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.947325 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.948964 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vdv2t" event={"ID":"73c32f7d-4d73-4876-8610-95810a5318c6","Type":"ContainerDied","Data":"c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a"} Nov 24 12:35:54 crc kubenswrapper[4752]: I1124 12:35:54.949106 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e95002b5b9d186efcf63773c9a295d5baafa9196c5a8c7a6738026d8ecac7a" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.065650 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:35:55 crc kubenswrapper[4752]: E1124 12:35:55.066836 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c32f7d-4d73-4876-8610-95810a5318c6" containerName="nova-cell1-conductor-db-sync" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.066871 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c32f7d-4d73-4876-8610-95810a5318c6" containerName="nova-cell1-conductor-db-sync" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.067255 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c32f7d-4d73-4876-8610-95810a5318c6" containerName="nova-cell1-conductor-db-sync" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.068399 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.080331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.089145 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.241521 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.241576 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.242632 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvptj\" (UniqueName: \"kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.344502 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.344556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.344701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvptj\" (UniqueName: \"kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.357555 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.357849 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.359429 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvptj\" (UniqueName: \"kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj\") pod \"nova-cell1-conductor-0\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.393545 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.506004 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.653275 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mss4j\" (UniqueName: \"kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j\") pod \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.653338 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data\") pod \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.653398 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle\") pod \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.653444 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts\") pod \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\" (UID: \"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd\") " Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.657668 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts" (OuterVolumeSpecName: "scripts") pod "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" (UID: "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.658987 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j" (OuterVolumeSpecName: "kube-api-access-mss4j") pod "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" (UID: "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd"). InnerVolumeSpecName "kube-api-access-mss4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.686935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data" (OuterVolumeSpecName: "config-data") pod "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" (UID: "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.687079 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" (UID: "f0f4c990-46bb-4b1a-ad4b-c7207ab5facd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.755965 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mss4j\" (UniqueName: \"kubernetes.io/projected/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-kube-api-access-mss4j\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.756326 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.756340 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.756350 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:35:55 crc kubenswrapper[4752]: W1124 12:35:55.883959 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0418f6f_75c8_4dac_b7d4_c95946071dec.slice/crio-4f51c5d9c0bb117e0729422a349ea73baba475063206a83cdb8058ad8ccd2eb1 WatchSource:0}: Error finding container 4f51c5d9c0bb117e0729422a349ea73baba475063206a83cdb8058ad8ccd2eb1: Status 404 returned error can't find the container with id 4f51c5d9c0bb117e0729422a349ea73baba475063206a83cdb8058ad8ccd2eb1 Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.885633 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.971288 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4jn2n" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.971289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4jn2n" event={"ID":"f0f4c990-46bb-4b1a-ad4b-c7207ab5facd","Type":"ContainerDied","Data":"462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3"} Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.971464 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="462f11a923fb8d9c2f55d9d81ea212c49e2d7a5e1aa02be4a5df5cdeaefdb5d3" Nov 24 12:35:55 crc kubenswrapper[4752]: I1124 12:35:55.975291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0418f6f-75c8-4dac-b7d4-c95946071dec","Type":"ContainerStarted","Data":"4f51c5d9c0bb117e0729422a349ea73baba475063206a83cdb8058ad8ccd2eb1"} Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.133799 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.134139 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-log" containerID="cri-o://f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95" gracePeriod=30 Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.134338 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-api" containerID="cri-o://b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb" gracePeriod=30 Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.150653 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.151012 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dc10936c-50a4-47ec-adb9-b1751a876713" containerName="nova-scheduler-scheduler" containerID="cri-o://5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9" gracePeriod=30 Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.213986 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.214309 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-log" containerID="cri-o://3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c" gracePeriod=30 Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.214476 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-metadata" containerID="cri-o://8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e" gracePeriod=30 Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.785463 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.790683 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.980600 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle\") pod \"be01389a-2506-46c0-9785-ecb02ade4e83\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.980685 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs\") pod \"be01389a-2506-46c0-9785-ecb02ade4e83\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.980805 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs\") pod \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.980869 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqhvw\" (UniqueName: \"kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw\") pod \"be01389a-2506-46c0-9785-ecb02ade4e83\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.980937 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle\") pod \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.981019 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk2z6\" (UniqueName: \"kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6\") pod \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.981229 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data\") pod \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\" (UID: \"cff9d56f-a1c7-4aea-ace8-ffb505dffb12\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.981279 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data\") pod \"be01389a-2506-46c0-9785-ecb02ade4e83\" (UID: \"be01389a-2506-46c0-9785-ecb02ade4e83\") "
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.981561 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs" (OuterVolumeSpecName: "logs") pod "cff9d56f-a1c7-4aea-ace8-ffb505dffb12" (UID: "cff9d56f-a1c7-4aea-ace8-ffb505dffb12"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.981584 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs" (OuterVolumeSpecName: "logs") pod "be01389a-2506-46c0-9785-ecb02ade4e83" (UID: "be01389a-2506-46c0-9785-ecb02ade4e83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.982173 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be01389a-2506-46c0-9785-ecb02ade4e83-logs\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.982214 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-logs\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990306 4752 generic.go:334] "Generic (PLEG): container finished" podID="be01389a-2506-46c0-9785-ecb02ade4e83" containerID="b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb" exitCode=0
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990366 4752 generic.go:334] "Generic (PLEG): container finished" podID="be01389a-2506-46c0-9785-ecb02ade4e83" containerID="f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95" exitCode=143
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990426 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990495 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerDied","Data":"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"}
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990569 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerDied","Data":"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"}
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990563 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6" (OuterVolumeSpecName: "kube-api-access-gk2z6") pod "cff9d56f-a1c7-4aea-ace8-ffb505dffb12" (UID: "cff9d56f-a1c7-4aea-ace8-ffb505dffb12"). InnerVolumeSpecName "kube-api-access-gk2z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990668 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be01389a-2506-46c0-9785-ecb02ade4e83","Type":"ContainerDied","Data":"e974dea27bdfbabcf39f7c011c681f710f308a42ab116acd43c5452c94de407a"}
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.990717 4752 scope.go:117] "RemoveContainer" containerID="b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.995804 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0418f6f-75c8-4dac-b7d4-c95946071dec","Type":"ContainerStarted","Data":"c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2"}
Nov 24 12:35:56 crc kubenswrapper[4752]: I1124 12:35:56.995853 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.001156 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw" (OuterVolumeSpecName: "kube-api-access-bqhvw") pod "be01389a-2506-46c0-9785-ecb02ade4e83" (UID: "be01389a-2506-46c0-9785-ecb02ade4e83"). InnerVolumeSpecName "kube-api-access-bqhvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.003679 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.009201 4752 generic.go:334] "Generic (PLEG): container finished" podID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerID="8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e" exitCode=0
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.009317 4752 generic.go:334] "Generic (PLEG): container finished" podID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerID="3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c" exitCode=143
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.009665 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerDied","Data":"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"}
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.009933 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerDied","Data":"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"}
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.009971 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cff9d56f-a1c7-4aea-ace8-ffb505dffb12","Type":"ContainerDied","Data":"b3e0b833635c0cb58a4e7cb1d1236c953e415c26266d2f87af14e8a25cfb9ce8"}
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.026877 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.026853071 podStartE2EDuration="2.026853071s" podCreationTimestamp="2025-11-24 12:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:57.017481562 +0000 UTC m=+5363.002301861" watchObservedRunningTime="2025-11-24 12:35:57.026853071 +0000 UTC m=+5363.011673370"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.034930 4752 scope.go:117] "RemoveContainer" containerID="f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.035403 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data" (OuterVolumeSpecName: "config-data") pod "be01389a-2506-46c0-9785-ecb02ade4e83" (UID: "be01389a-2506-46c0-9785-ecb02ade4e83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.040627 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cff9d56f-a1c7-4aea-ace8-ffb505dffb12" (UID: "cff9d56f-a1c7-4aea-ace8-ffb505dffb12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.042339 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data" (OuterVolumeSpecName: "config-data") pod "cff9d56f-a1c7-4aea-ace8-ffb505dffb12" (UID: "cff9d56f-a1c7-4aea-ace8-ffb505dffb12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.045852 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be01389a-2506-46c0-9785-ecb02ade4e83" (UID: "be01389a-2506-46c0-9785-ecb02ade4e83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.061524 4752 scope.go:117] "RemoveContainer" containerID="b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.062263 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb\": container with ID starting with b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb not found: ID does not exist" containerID="b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.062329 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"} err="failed to get container status \"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb\": rpc error: code = NotFound desc = could not find container \"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb\": container with ID starting with b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.062376 4752 scope.go:117] "RemoveContainer" containerID="f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.062778 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95\": container with ID starting with f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95 not found: ID does not exist" containerID="f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.062812 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"} err="failed to get container status \"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95\": rpc error: code = NotFound desc = could not find container \"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95\": container with ID starting with f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95 not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.062829 4752 scope.go:117] "RemoveContainer" containerID="b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.063302 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb"} err="failed to get container status \"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb\": rpc error: code = NotFound desc = could not find container \"b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb\": container with ID starting with b3c42f90d1bd4b55a0e549de38a3f7c73f4902ac463d307a3e0107550c41bffb not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.063336 4752 scope.go:117] "RemoveContainer" containerID="f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.063574 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95"} err="failed to get container status \"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95\": rpc error: code = NotFound desc = could not find container \"f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95\": container with ID starting with f6b1ddf86d7e34d5547d04ad1b5842ca070fcc7c2ac6c08c822fea22830c0d95 not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.063610 4752 scope.go:117] "RemoveContainer" containerID="8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083271 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqhvw\" (UniqueName: \"kubernetes.io/projected/be01389a-2506-46c0-9785-ecb02ade4e83-kube-api-access-bqhvw\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083321 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083338 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk2z6\" (UniqueName: \"kubernetes.io/projected/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-kube-api-access-gk2z6\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083353 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff9d56f-a1c7-4aea-ace8-ffb505dffb12-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083368 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.083382 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be01389a-2506-46c0-9785-ecb02ade4e83-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.113632 4752 scope.go:117] "RemoveContainer" containerID="3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.134523 4752 scope.go:117] "RemoveContainer" containerID="8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.135214 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e\": container with ID starting with 8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e not found: ID does not exist" containerID="8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.135266 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"} err="failed to get container status \"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e\": rpc error: code = NotFound desc = could not find container \"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e\": container with ID starting with 8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.135299 4752 scope.go:117] "RemoveContainer" containerID="3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.135823 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c\": container with ID starting with 3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c not found: ID does not exist" containerID="3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.135859 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"} err="failed to get container status \"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c\": rpc error: code = NotFound desc = could not find container \"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c\": container with ID starting with 3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.135880 4752 scope.go:117] "RemoveContainer" containerID="8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.136287 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e"} err="failed to get container status \"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e\": rpc error: code = NotFound desc = could not find container \"8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e\": container with ID starting with 8bc24ff87a5f7e1335640989d746e97393a515ae15029e9cb14d330f29cb792e not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.136523 4752 scope.go:117] "RemoveContainer" containerID="3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.137177 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c"} err="failed to get container status \"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c\": rpc error: code = NotFound desc = could not find container \"3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c\": container with ID starting with 3696dbf1d406feba2a82f62411ff965773e7dabb7807ec1d5f5ff6daa7b5427c not found: ID does not exist"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.237522 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.264313 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.359314 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.384317 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.398689 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.412550 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.417593 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.418080 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418095 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.418109 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" containerName="nova-manage"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418116 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" containerName="nova-manage"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.418124 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-api"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418135 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-api"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.418152 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418159 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: E1124 12:35:57.418168 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-metadata"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418175 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-metadata"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418429 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418441 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" containerName="nova-api-api"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418457 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" containerName="nova-manage"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418477 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-metadata"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.418488 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" containerName="nova-metadata-log"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.421890 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.425214 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.435232 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.438420 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.440107 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.444253 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.454328 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494611 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494655 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494760 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htq2p\" (UniqueName: \"kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494796 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494845 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvpj\" (UniqueName: \"kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.494989 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.596855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.596932 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.596972 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.597216 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htq2p\" (UniqueName: \"kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.597245 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.597290 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.597325 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.597353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvpj\" (UniqueName: \"kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.598182 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.598774 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.603733 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.604463 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.606377 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.616371 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htq2p\" (UniqueName: \"kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p\") pod \"nova-api-0\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.622384 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.631078 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvpj\" (UniqueName: \"kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj\") pod \"nova-metadata-0\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.710997 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.761302 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.768556 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.778250 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"]
Nov 24 12:35:57 crc kubenswrapper[4752]: I1124 12:35:57.778524 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="dnsmasq-dns" containerID="cri-o://7790afb5b21cf87da44155ed6448b08407c798a3c86b7c505eac128c7f84972e" gracePeriod=10
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.029043 4752 generic.go:334] "Generic (PLEG): container finished" podID="33172046-c564-4757-a04a-42174d21426c" containerID="7790afb5b21cf87da44155ed6448b08407c798a3c86b7c505eac128c7f84972e" exitCode=0
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.029149 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" event={"ID":"33172046-c564-4757-a04a-42174d21426c","Type":"ContainerDied","Data":"7790afb5b21cf87da44155ed6448b08407c798a3c86b7c505eac128c7f84972e"}
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.049071 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.214268 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d"
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.312009 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klkg6\" (UniqueName: \"kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6\") pod \"33172046-c564-4757-a04a-42174d21426c\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") "
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.312136 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc\") pod \"33172046-c564-4757-a04a-42174d21426c\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") "
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.312198 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config\") pod \"33172046-c564-4757-a04a-42174d21426c\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") "
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.312303 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb\") pod \"33172046-c564-4757-a04a-42174d21426c\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") "
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.312382 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb\") pod \"33172046-c564-4757-a04a-42174d21426c\" (UID: \"33172046-c564-4757-a04a-42174d21426c\") "
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.321031 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6" (OuterVolumeSpecName: "kube-api-access-klkg6") pod "33172046-c564-4757-a04a-42174d21426c" (UID: "33172046-c564-4757-a04a-42174d21426c"). InnerVolumeSpecName "kube-api-access-klkg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.359975 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33172046-c564-4757-a04a-42174d21426c" (UID: "33172046-c564-4757-a04a-42174d21426c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.368063 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config" (OuterVolumeSpecName: "config") pod "33172046-c564-4757-a04a-42174d21426c" (UID: "33172046-c564-4757-a04a-42174d21426c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.390509 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.414376 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klkg6\" (UniqueName: \"kubernetes.io/projected/33172046-c564-4757-a04a-42174d21426c-kube-api-access-klkg6\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.414405 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.414415 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-config\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.430566 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33172046-c564-4757-a04a-42174d21426c" (UID: "33172046-c564-4757-a04a-42174d21426c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.458323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33172046-c564-4757-a04a-42174d21426c" (UID: "33172046-c564-4757-a04a-42174d21426c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.471954 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.515854 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.515884 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33172046-c564-4757-a04a-42174d21426c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.741163 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be01389a-2506-46c0-9785-ecb02ade4e83" path="/var/lib/kubelet/pods/be01389a-2506-46c0-9785-ecb02ade4e83/volumes"
Nov 24 12:35:58 crc kubenswrapper[4752]: I1124 12:35:58.742295 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff9d56f-a1c7-4aea-ace8-ffb505dffb12" path="/var/lib/kubelet/pods/cff9d56f-a1c7-4aea-ace8-ffb505dffb12/volumes"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.053153 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d" event={"ID":"33172046-c564-4757-a04a-42174d21426c","Type":"ContainerDied","Data":"a1ca51719c068077c94361ec2a94c59339f440e248000a6c4d2426b0c2e52d84"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.053206 4752 scope.go:117] "RemoveContainer" containerID="7790afb5b21cf87da44155ed6448b08407c798a3c86b7c505eac128c7f84972e"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.053298 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6977c95bf9-rmg2d"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.061433 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerStarted","Data":"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.061495 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerStarted","Data":"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.061520 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerStarted","Data":"413b1e621f34c32e5795e0655d2b99ceb7ac7b80e5bab9345935371d15485ab0"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.065833 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerStarted","Data":"3712d3b35fcff251cf7df74713eede5b7740abf884064a5915f83011bbc1ddbc"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.065877 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerStarted","Data":"442252e897da7c5af4ee0860dfe42b690d20c3a15a6a3ad3bbcff684c71c8668"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.065890 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerStarted","Data":"ef4ad1744ac20103c1299a66186f17a374949e0a2727c26a008d992af6b5a0e3"}
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.087921 4752 scope.go:117] "RemoveContainer" containerID="e899bb6478cfa04d7c3caf6d3cec520420f6d48cb29c605cb5b697b2f21cd1f2"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.088568 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.088550277 podStartE2EDuration="2.088550277s" podCreationTimestamp="2025-11-24 12:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:59.079457146 +0000 UTC m=+5365.064277455" watchObservedRunningTime="2025-11-24 12:35:59.088550277 +0000 UTC m=+5365.073370566"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.121860 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"]
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.133295 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6977c95bf9-rmg2d"]
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.140754 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.140721944 podStartE2EDuration="2.140721944s" podCreationTimestamp="2025-11-24 12:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:35:59.125717273 +0000 UTC m=+5365.110537552" watchObservedRunningTime="2025-11-24 12:35:59.140721944 +0000 UTC m=+5365.125542233"
Nov 24 12:35:59 crc kubenswrapper[4752]: I1124 12:35:59.999026 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.076215 4752 generic.go:334] "Generic (PLEG): container finished" podID="dc10936c-50a4-47ec-adb9-b1751a876713" containerID="5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9" exitCode=0
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.076265 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.076260 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc10936c-50a4-47ec-adb9-b1751a876713","Type":"ContainerDied","Data":"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"}
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.076489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc10936c-50a4-47ec-adb9-b1751a876713","Type":"ContainerDied","Data":"93e340e9644df9d5a4076f1930539aaab7297d0420be66fcab42a3c2f200fd5a"}
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.076508 4752 scope.go:117] "RemoveContainer" containerID="5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.108386 4752 scope.go:117] "RemoveContainer" containerID="5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"
Nov 24 12:36:00 crc kubenswrapper[4752]: E1124 12:36:00.109203 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9\": container with ID starting with 5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9 not found: ID does not exist" containerID="5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.109232 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9"} err="failed to get container status \"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9\": rpc error: code = NotFound desc = could not find container \"5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9\": container with ID starting with 5917d6639da31c6bb545817298326c17406430068d1174b253c0915dcdcdd4c9 not found: ID does not exist"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.153266 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn7vl\" (UniqueName: \"kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl\") pod \"dc10936c-50a4-47ec-adb9-b1751a876713\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") "
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.153382 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle\") pod \"dc10936c-50a4-47ec-adb9-b1751a876713\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") "
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.153480 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data\") pod \"dc10936c-50a4-47ec-adb9-b1751a876713\" (UID: \"dc10936c-50a4-47ec-adb9-b1751a876713\") "
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.158614 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl" (OuterVolumeSpecName: "kube-api-access-hn7vl") pod "dc10936c-50a4-47ec-adb9-b1751a876713" (UID: "dc10936c-50a4-47ec-adb9-b1751a876713"). InnerVolumeSpecName "kube-api-access-hn7vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.180520 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data" (OuterVolumeSpecName: "config-data") pod "dc10936c-50a4-47ec-adb9-b1751a876713" (UID: "dc10936c-50a4-47ec-adb9-b1751a876713"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.189722 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc10936c-50a4-47ec-adb9-b1751a876713" (UID: "dc10936c-50a4-47ec-adb9-b1751a876713"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.256003 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn7vl\" (UniqueName: \"kubernetes.io/projected/dc10936c-50a4-47ec-adb9-b1751a876713-kube-api-access-hn7vl\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.256047 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.256062 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc10936c-50a4-47ec-adb9-b1751a876713-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.488318 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.507572 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.524711 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 24 12:36:00 crc kubenswrapper[4752]: E1124 12:36:00.525827 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="init"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.525852 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="init"
Nov 24 12:36:00 crc kubenswrapper[4752]: E1124 12:36:00.525884 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="dnsmasq-dns"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.525895 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="dnsmasq-dns"
Nov 24 12:36:00 crc kubenswrapper[4752]: E1124 12:36:00.525906 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc10936c-50a4-47ec-adb9-b1751a876713" containerName="nova-scheduler-scheduler"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.525915 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc10936c-50a4-47ec-adb9-b1751a876713" containerName="nova-scheduler-scheduler"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.526125 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc10936c-50a4-47ec-adb9-b1751a876713" containerName="nova-scheduler-scheduler"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.526184 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="33172046-c564-4757-a04a-42174d21426c" containerName="dnsmasq-dns"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.527005 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.532184 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.535575 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.665304 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgrg\" (UniqueName: \"kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.665443 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.665496 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.745647 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33172046-c564-4757-a04a-42174d21426c" path="/var/lib/kubelet/pods/33172046-c564-4757-a04a-42174d21426c/volumes"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.746611 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc10936c-50a4-47ec-adb9-b1751a876713" path="/var/lib/kubelet/pods/dc10936c-50a4-47ec-adb9-b1751a876713/volumes"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.766474 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgrg\" (UniqueName: \"kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.766601 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.766663 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.772332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.772642 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.790253 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgrg\" (UniqueName: \"kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg\") pod \"nova-scheduler-0\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " pod="openstack/nova-scheduler-0"
Nov 24 12:36:00 crc kubenswrapper[4752]: I1124 12:36:00.854677 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 24 12:36:01 crc kubenswrapper[4752]: I1124 12:36:01.372467 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 24 12:36:01 crc kubenswrapper[4752]: W1124 12:36:01.375621 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72ff0de6_c8b4_4d94_b1cc_f09abf4bc73d.slice/crio-4ff53b033dfd09bcb19dd2c4589a581db5ef87009f633b0306070003ecc12451 WatchSource:0}: Error finding container 4ff53b033dfd09bcb19dd2c4589a581db5ef87009f633b0306070003ecc12451: Status 404 returned error can't find the container with id 4ff53b033dfd09bcb19dd2c4589a581db5ef87009f633b0306070003ecc12451
Nov 24 12:36:02 crc kubenswrapper[4752]: I1124 12:36:02.109647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d","Type":"ContainerStarted","Data":"fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7"}
Nov 24 12:36:02 crc kubenswrapper[4752]: I1124 12:36:02.109988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d","Type":"ContainerStarted","Data":"4ff53b033dfd09bcb19dd2c4589a581db5ef87009f633b0306070003ecc12451"}
Nov 24 12:36:02 crc kubenswrapper[4752]: I1124 12:36:02.135696 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.135677942 podStartE2EDuration="2.135677942s" podCreationTimestamp="2025-11-24 12:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:02.130251777 +0000 UTC m=+5368.115072066" watchObservedRunningTime="2025-11-24 12:36:02.135677942 +0000 UTC m=+5368.120498231"
Nov 24 12:36:02 crc kubenswrapper[4752]: I1124 12:36:02.769596 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 24 12:36:02 crc kubenswrapper[4752]: I1124 12:36:02.769735 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.426666 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.855392 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.967910 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zmsrk"]
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.969251 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.971813 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.972009 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Nov 24 12:36:05 crc kubenswrapper[4752]: I1124 12:36:05.980327 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zmsrk"]
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.010510 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.010581 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjvp\" (UniqueName: \"kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.010628 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.010774 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.112864 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.115120 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.115166 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjvp\" (UniqueName: \"kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.115223 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.128177 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.128197 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.129016 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.139213 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjvp\" (UniqueName: \"kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp\") pod \"nova-cell1-cell-mapping-zmsrk\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.291341 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zmsrk"
Nov 24 12:36:06 crc kubenswrapper[4752]: W1124 12:36:06.810578 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7af01e52_ecc0_4328_9619_a48520ec233e.slice/crio-353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab WatchSource:0}: Error finding container 353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab: Status 404 returned error can't find the container with id 353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab
Nov 24 12:36:06 crc kubenswrapper[4752]: I1124 12:36:06.812817 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zmsrk"]
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.164888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zmsrk" event={"ID":"7af01e52-ecc0-4328-9619-a48520ec233e","Type":"ContainerStarted","Data":"fcf86d523ed08db04b505e8ec59028ab8a5a646c2138613945af49f0b4e19c6b"}
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.164937 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zmsrk" event={"ID":"7af01e52-ecc0-4328-9619-a48520ec233e","Type":"ContainerStarted","Data":"353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab"}
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.189463 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zmsrk" podStartSLOduration=2.189439503 podStartE2EDuration="2.189439503s" podCreationTimestamp="2025-11-24 12:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:07.184054629 +0000 UTC m=+5373.168874938" watchObservedRunningTime="2025-11-24 12:36:07.189439503 +0000 UTC m=+5373.174259802"
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.765246 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.765884 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.770179 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 24 12:36:07 crc kubenswrapper[4752]: I1124 12:36:07.770245 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 24 12:36:08 crc kubenswrapper[4752]: I1124 12:36:08.930956 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.69:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 24 12:36:08 crc kubenswrapper[4752]: I1124 12:36:08.930999 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 24 12:36:08 crc kubenswrapper[4752]: I1124 12:36:08.930956 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0"
podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.69:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:08 crc kubenswrapper[4752]: I1124 12:36:08.930950 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:10 crc kubenswrapper[4752]: I1124 12:36:10.855078 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 12:36:10 crc kubenswrapper[4752]: I1124 12:36:10.912190 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 12:36:11 crc kubenswrapper[4752]: I1124 12:36:11.250637 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 12:36:12 crc kubenswrapper[4752]: I1124 12:36:12.212989 4752 generic.go:334] "Generic (PLEG): container finished" podID="7af01e52-ecc0-4328-9619-a48520ec233e" containerID="fcf86d523ed08db04b505e8ec59028ab8a5a646c2138613945af49f0b4e19c6b" exitCode=0 Nov 24 12:36:12 crc kubenswrapper[4752]: I1124 12:36:12.213069 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zmsrk" event={"ID":"7af01e52-ecc0-4328-9619-a48520ec233e","Type":"ContainerDied","Data":"fcf86d523ed08db04b505e8ec59028ab8a5a646c2138613945af49f0b4e19c6b"} Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.630723 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zmsrk" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.728601 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle\") pod \"7af01e52-ecc0-4328-9619-a48520ec233e\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.728794 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zjvp\" (UniqueName: \"kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp\") pod \"7af01e52-ecc0-4328-9619-a48520ec233e\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.728850 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data\") pod \"7af01e52-ecc0-4328-9619-a48520ec233e\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.728876 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts\") pod \"7af01e52-ecc0-4328-9619-a48520ec233e\" (UID: \"7af01e52-ecc0-4328-9619-a48520ec233e\") " Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.735170 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp" (OuterVolumeSpecName: "kube-api-access-5zjvp") pod "7af01e52-ecc0-4328-9619-a48520ec233e" (UID: "7af01e52-ecc0-4328-9619-a48520ec233e"). InnerVolumeSpecName "kube-api-access-5zjvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.738823 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts" (OuterVolumeSpecName: "scripts") pod "7af01e52-ecc0-4328-9619-a48520ec233e" (UID: "7af01e52-ecc0-4328-9619-a48520ec233e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.761863 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data" (OuterVolumeSpecName: "config-data") pod "7af01e52-ecc0-4328-9619-a48520ec233e" (UID: "7af01e52-ecc0-4328-9619-a48520ec233e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.774915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7af01e52-ecc0-4328-9619-a48520ec233e" (UID: "7af01e52-ecc0-4328-9619-a48520ec233e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.832278 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.832321 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zjvp\" (UniqueName: \"kubernetes.io/projected/7af01e52-ecc0-4328-9619-a48520ec233e-kube-api-access-5zjvp\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.832335 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:13 crc kubenswrapper[4752]: I1124 12:36:13.832348 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af01e52-ecc0-4328-9619-a48520ec233e-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.238477 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zmsrk" event={"ID":"7af01e52-ecc0-4328-9619-a48520ec233e","Type":"ContainerDied","Data":"353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab"} Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.238944 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353eef615a71c1e648237e05deba764a45608e678897c0b0b6bd977b1fa55aab" Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.238669 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zmsrk" Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.451198 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.453802 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-log" containerID="cri-o://7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458" gracePeriod=30 Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.454003 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-api" containerID="cri-o://360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1" gracePeriod=30 Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.510176 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.510854 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerName="nova-scheduler-scheduler" containerID="cri-o://fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" gracePeriod=30 Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.525206 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.525532 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1673cf77-92e6-4619-83b7-6be76390a879" 
containerName="nova-metadata-log" containerID="cri-o://442252e897da7c5af4ee0860dfe42b690d20c3a15a6a3ad3bbcff684c71c8668" gracePeriod=30 Nov 24 12:36:14 crc kubenswrapper[4752]: I1124 12:36:14.525570 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-metadata" containerID="cri-o://3712d3b35fcff251cf7df74713eede5b7740abf884064a5915f83011bbc1ddbc" gracePeriod=30 Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.253190 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerID="7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458" exitCode=143 Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.253253 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerDied","Data":"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458"} Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.256438 4752 generic.go:334] "Generic (PLEG): container finished" podID="1673cf77-92e6-4619-83b7-6be76390a879" containerID="442252e897da7c5af4ee0860dfe42b690d20c3a15a6a3ad3bbcff684c71c8668" exitCode=143 Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.256481 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerDied","Data":"442252e897da7c5af4ee0860dfe42b690d20c3a15a6a3ad3bbcff684c71c8668"} Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.468938 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:36:15 crc kubenswrapper[4752]: I1124 12:36:15.469041 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:36:15 crc kubenswrapper[4752]: E1124 12:36:15.857437 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 12:36:15 crc kubenswrapper[4752]: E1124 12:36:15.859040 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 12:36:15 crc kubenswrapper[4752]: E1124 12:36:15.861085 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 12:36:15 crc 
kubenswrapper[4752]: E1124 12:36:15.861232 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerName="nova-scheduler-scheduler" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.140678 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.229465 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle\") pod \"4f455325-fbd9-40da-ad04-f9c27ce5e506\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.229514 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data\") pod \"4f455325-fbd9-40da-ad04-f9c27ce5e506\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.229592 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs\") pod \"4f455325-fbd9-40da-ad04-f9c27ce5e506\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.229740 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htq2p\" (UniqueName: \"kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p\") pod \"4f455325-fbd9-40da-ad04-f9c27ce5e506\" (UID: \"4f455325-fbd9-40da-ad04-f9c27ce5e506\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.230931 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs" (OuterVolumeSpecName: "logs") pod "4f455325-fbd9-40da-ad04-f9c27ce5e506" (UID: "4f455325-fbd9-40da-ad04-f9c27ce5e506"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.237921 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p" (OuterVolumeSpecName: "kube-api-access-htq2p") pod "4f455325-fbd9-40da-ad04-f9c27ce5e506" (UID: "4f455325-fbd9-40da-ad04-f9c27ce5e506"). InnerVolumeSpecName "kube-api-access-htq2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.254899 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data" (OuterVolumeSpecName: "config-data") pod "4f455325-fbd9-40da-ad04-f9c27ce5e506" (UID: "4f455325-fbd9-40da-ad04-f9c27ce5e506"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.257407 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f455325-fbd9-40da-ad04-f9c27ce5e506" (UID: "4f455325-fbd9-40da-ad04-f9c27ce5e506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.297891 4752 generic.go:334] "Generic (PLEG): container finished" podID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerID="fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" exitCode=0 Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.297962 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d","Type":"ContainerDied","Data":"fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7"} Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.303698 4752 generic.go:334] "Generic (PLEG): container finished" podID="1673cf77-92e6-4619-83b7-6be76390a879" containerID="3712d3b35fcff251cf7df74713eede5b7740abf884064a5915f83011bbc1ddbc" exitCode=0 Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.303932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerDied","Data":"3712d3b35fcff251cf7df74713eede5b7740abf884064a5915f83011bbc1ddbc"} Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.306930 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerID="360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1" exitCode=0 Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.306997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerDied","Data":"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1"} Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.307038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f455325-fbd9-40da-ad04-f9c27ce5e506","Type":"ContainerDied","Data":"413b1e621f34c32e5795e0655d2b99ceb7ac7b80e5bab9345935371d15485ab0"} Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.307069 4752 scope.go:117] "RemoveContainer" containerID="360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.307260 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.332552 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htq2p\" (UniqueName: \"kubernetes.io/projected/4f455325-fbd9-40da-ad04-f9c27ce5e506-kube-api-access-htq2p\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.332582 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.332596 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f455325-fbd9-40da-ad04-f9c27ce5e506-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.332610 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f455325-fbd9-40da-ad04-f9c27ce5e506-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.345939 4752 scope.go:117] "RemoveContainer" containerID="7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.348888 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.363622 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.374971 4752 scope.go:117] "RemoveContainer" containerID="360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.376810 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:18 crc kubenswrapper[4752]: E1124 12:36:18.377128 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1\": container with ID starting with 360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1 not found: ID does not exist" containerID="360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377164 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1"} err="failed to get container status \"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1\": rpc error: code = NotFound desc = could not find container \"360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1\": container with ID starting with 360a18c9cf640eefbbd5868c84293b52e48884d265ffc943b1b66d1f19f7a7e1 not found: ID does not exist" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377186 4752 scope.go:117] "RemoveContainer" containerID="7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458" Nov 24 12:36:18 crc kubenswrapper[4752]: E1124 12:36:18.377630 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-log" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377657 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-log" Nov 24 12:36:18 crc 
kubenswrapper[4752]: E1124 12:36:18.377688 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-api" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377694 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-api" Nov 24 12:36:18 crc kubenswrapper[4752]: E1124 12:36:18.377714 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af01e52-ecc0-4328-9619-a48520ec233e" containerName="nova-manage" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377720 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af01e52-ecc0-4328-9619-a48520ec233e" containerName="nova-manage" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377945 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af01e52-ecc0-4328-9619-a48520ec233e" containerName="nova-manage" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377962 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-log" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.377972 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" containerName="nova-api-api" Nov 24 12:36:18 crc kubenswrapper[4752]: E1124 12:36:18.378451 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458\": container with ID starting with 7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458 not found: ID does not exist" containerID="7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.378502 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458"} err="failed to get container status \"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458\": rpc error: code = NotFound desc = could not find container \"7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458\": container with ID starting with 7804130b6c437536708001b3996b0192fc8a6dc91cdaed041dd96e2734768458 not found: ID does not exist" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.384348 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.388266 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.390156 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.434078 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2f7f\" (UniqueName: \"kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.434360 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.434425 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.434461 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.537017 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.537112 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.537210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2f7f\" (UniqueName: \"kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.537310 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.537802 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " 
pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.542287 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.542719 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.552993 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2f7f\" (UniqueName: \"kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f\") pod \"nova-api-0\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.700941 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.741205 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f455325-fbd9-40da-ad04-f9c27ce5e506" path="/var/lib/kubelet/pods/4f455325-fbd9-40da-ad04-f9c27ce5e506/volumes" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.873509 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.878624 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.946735 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle\") pod \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.946832 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data\") pod \"1673cf77-92e6-4619-83b7-6be76390a879\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.946884 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvgrg\" (UniqueName: \"kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg\") pod \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.946961 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data\") pod \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\" (UID: \"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.947013 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs\") pod \"1673cf77-92e6-4619-83b7-6be76390a879\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.947088 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvpj\" (UniqueName: \"kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj\") pod \"1673cf77-92e6-4619-83b7-6be76390a879\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.947162 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle\") pod \"1673cf77-92e6-4619-83b7-6be76390a879\" (UID: \"1673cf77-92e6-4619-83b7-6be76390a879\") " Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.949117 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs" (OuterVolumeSpecName: "logs") pod "1673cf77-92e6-4619-83b7-6be76390a879" (UID: "1673cf77-92e6-4619-83b7-6be76390a879"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.952459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj" (OuterVolumeSpecName: "kube-api-access-ltvpj") pod "1673cf77-92e6-4619-83b7-6be76390a879" (UID: "1673cf77-92e6-4619-83b7-6be76390a879"). InnerVolumeSpecName "kube-api-access-ltvpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.967220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg" (OuterVolumeSpecName: "kube-api-access-jvgrg") pod "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" (UID: "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d"). InnerVolumeSpecName "kube-api-access-jvgrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.982009 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data" (OuterVolumeSpecName: "config-data") pod "1673cf77-92e6-4619-83b7-6be76390a879" (UID: "1673cf77-92e6-4619-83b7-6be76390a879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.982838 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1673cf77-92e6-4619-83b7-6be76390a879" (UID: "1673cf77-92e6-4619-83b7-6be76390a879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.983935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data" (OuterVolumeSpecName: "config-data") pod "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" (UID: "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:18 crc kubenswrapper[4752]: I1124 12:36:18.985773 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" (UID: "72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051053 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051099 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1673cf77-92e6-4619-83b7-6be76390a879-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051112 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltvpj\" (UniqueName: \"kubernetes.io/projected/1673cf77-92e6-4619-83b7-6be76390a879-kube-api-access-ltvpj\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051129 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051143 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051153 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1673cf77-92e6-4619-83b7-6be76390a879-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.051164 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvgrg\" (UniqueName: \"kubernetes.io/projected/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d-kube-api-access-jvgrg\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:19 crc kubenswrapper[4752]: W1124 12:36:19.173022 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487dd637_63f1_43f5_b9a2_2c5b8c5a2453.slice/crio-c6878aa6d97d8733fde17c0133cd3b7994a5a0e1449f38a3a1c6b796a9eea970 WatchSource:0}: Error finding container c6878aa6d97d8733fde17c0133cd3b7994a5a0e1449f38a3a1c6b796a9eea970: Status 404 returned error can't find the container with id c6878aa6d97d8733fde17c0133cd3b7994a5a0e1449f38a3a1c6b796a9eea970 Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.174607 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.321060 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerStarted","Data":"c6878aa6d97d8733fde17c0133cd3b7994a5a0e1449f38a3a1c6b796a9eea970"} Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.323568 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d","Type":"ContainerDied","Data":"4ff53b033dfd09bcb19dd2c4589a581db5ef87009f633b0306070003ecc12451"} Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.323694 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.323730 4752 scope.go:117] "RemoveContainer" containerID="fd9e0d7459c2767f381ad84477ef9afda7619ee1a2842df2c7ea445b04c8c2c7" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.327532 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1673cf77-92e6-4619-83b7-6be76390a879","Type":"ContainerDied","Data":"ef4ad1744ac20103c1299a66186f17a374949e0a2727c26a008d992af6b5a0e3"} Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.327593 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.350387 4752 scope.go:117] "RemoveContainer" containerID="3712d3b35fcff251cf7df74713eede5b7740abf884064a5915f83011bbc1ddbc" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.380653 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.391134 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.405310 4752 scope.go:117] "RemoveContainer" containerID="442252e897da7c5af4ee0860dfe42b690d20c3a15a6a3ad3bbcff684c71c8668" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.433811 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: E1124 12:36:19.434366 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerName="nova-scheduler-scheduler" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434378 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerName="nova-scheduler-scheduler" Nov 24 12:36:19 crc kubenswrapper[4752]: E1124 12:36:19.434388 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-log" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434394 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-log" Nov 24 12:36:19 crc kubenswrapper[4752]: E1124 12:36:19.434410 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-metadata" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434416 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-metadata" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434569 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-metadata" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434581 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" containerName="nova-scheduler-scheduler" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.434601 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1673cf77-92e6-4619-83b7-6be76390a879" containerName="nova-metadata-log" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.435393 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:19 crc 
kubenswrapper[4752]: I1124 12:36:19.435407 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.435483 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.443521 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.446708 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.457262 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.458473 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.459583 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.459652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxkb\" (UniqueName: \"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.459697 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.459726 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.463560 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.512734 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561313 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmft\" (UniqueName: \"kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxkb\" (UniqueName: \"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " 
pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561450 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561614 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.561892 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.566024 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.566189 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.576872 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxkb\" (UniqueName: \"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb\") pod \"nova-metadata-0\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.663303 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data\") pod \"nova-scheduler-0\" (UID: 
\"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.663342 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmft\" (UniqueName: \"kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.663432 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.666178 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.669544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.682958 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmft\" (UniqueName: \"kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft\") pod \"nova-scheduler-0\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " pod="openstack/nova-scheduler-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.791801 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:36:19 crc kubenswrapper[4752]: I1124 12:36:19.802357 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:36:20 crc kubenswrapper[4752]: W1124 12:36:20.261014 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff4093e0_835e_4833_afb9_bbd891f00f93.slice/crio-cc22278d447679c4d20d95bc3a68063e9a3447c9d62ca2ade43e7a89c8300084 WatchSource:0}: Error finding container cc22278d447679c4d20d95bc3a68063e9a3447c9d62ca2ade43e7a89c8300084: Status 404 returned error can't find the container with id cc22278d447679c4d20d95bc3a68063e9a3447c9d62ca2ade43e7a89c8300084 Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.261675 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.325152 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:36:20 crc kubenswrapper[4752]: W1124 12:36:20.335676 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2ced4e_3e49_4591_a1c2_88cb0941afd4.slice/crio-68b5ae7fd207080fc8a812b70ec9d0b6d3db436f7b2e8fc3101a6b0e86154221 WatchSource:0}: Error finding container 68b5ae7fd207080fc8a812b70ec9d0b6d3db436f7b2e8fc3101a6b0e86154221: Status 404 returned error can't find the container with id 68b5ae7fd207080fc8a812b70ec9d0b6d3db436f7b2e8fc3101a6b0e86154221 Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.344770 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff4093e0-835e-4833-afb9-bbd891f00f93","Type":"ContainerStarted","Data":"cc22278d447679c4d20d95bc3a68063e9a3447c9d62ca2ade43e7a89c8300084"} Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.347064 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerStarted","Data":"6768b83abb28a36d2a6ed96add9a5c55c2e434e3164c29ee672f564f229f0568"} Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.347159 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerStarted","Data":"58980bb9e0653822f9cec600c9f30f3d298cd6e78077c65e7784c608d941999e"} Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.376444 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.376423008 podStartE2EDuration="2.376423008s" podCreationTimestamp="2025-11-24 12:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:20.369323694 +0000 UTC m=+5386.354143993" watchObservedRunningTime="2025-11-24 12:36:20.376423008 +0000 UTC m=+5386.361243297" Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.742315 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1673cf77-92e6-4619-83b7-6be76390a879" path="/var/lib/kubelet/pods/1673cf77-92e6-4619-83b7-6be76390a879/volumes" Nov 24 12:36:20 crc kubenswrapper[4752]: I1124 12:36:20.743330 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d" path="/var/lib/kubelet/pods/72ff0de6-c8b4-4d94-b1cc-f09abf4bc73d/volumes" Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.363895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ff4093e0-835e-4833-afb9-bbd891f00f93","Type":"ContainerStarted","Data":"f285d79c70d707f82914d71de3363f5ad93fc63c12316bd22012b8a653ce79ef"} Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.376044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerStarted","Data":"6b0f22bbd3c5774761af0482964d2ad3f1c3f5e2bf85245a0374282b45094763"} Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.376099 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerStarted","Data":"de3314570b7010816dc37a0aa540096ee297b7eaac083fc6d467ae1624081b06"} Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.376115 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerStarted","Data":"68b5ae7fd207080fc8a812b70ec9d0b6d3db436f7b2e8fc3101a6b0e86154221"} Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.436385 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.436363205 podStartE2EDuration="2.436363205s" podCreationTimestamp="2025-11-24 12:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:21.388173372 +0000 UTC m=+5387.372993701" watchObservedRunningTime="2025-11-24 12:36:21.436363205 +0000 UTC m=+5387.421183494" Nov 24 12:36:21 crc kubenswrapper[4752]: I1124 12:36:21.445308 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.44527756 podStartE2EDuration="2.44527756s" podCreationTimestamp="2025-11-24 12:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:21.417401231 +0000 UTC m=+5387.402221560" watchObservedRunningTime="2025-11-24 12:36:21.44527756 +0000 UTC m=+5387.430097879" Nov 24 12:36:24 crc kubenswrapper[4752]: I1124 12:36:24.792398 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:36:24 crc kubenswrapper[4752]: I1124 12:36:24.792889 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:36:24 crc kubenswrapper[4752]: I1124 12:36:24.803456 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 12:36:28 crc kubenswrapper[4752]: I1124 12:36:28.701659 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 12:36:28 crc kubenswrapper[4752]: I1124 12:36:28.704267 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.784021 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.784137 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.793241 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.793286 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.803691 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 12:36:29 crc kubenswrapper[4752]: I1124 12:36:29.843488 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 12:36:30 crc kubenswrapper[4752]: I1124 12:36:30.530529 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 12:36:30 crc kubenswrapper[4752]: I1124 12:36:30.875117 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:30 crc kubenswrapper[4752]: I1124 12:36:30.875191 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.707897 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.708768 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.709502 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.709553 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.714362 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.715633 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.928882 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.930950 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:38 crc kubenswrapper[4752]: I1124 12:36:38.989277 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.095499 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.095979 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.096070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.096202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942rz\" (UniqueName: \"kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.096305 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.197418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.197503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942rz\" (UniqueName: \"kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.197526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.197606 4752 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.197636 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.198355 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.198370 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.198672 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.198944 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.215617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942rz\" (UniqueName: \"kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz\") pod \"dnsmasq-dns-57958c8f89-5j58w\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.256182 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.751127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:36:39 crc kubenswrapper[4752]: W1124 12:36:39.758439 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a8699ad_3e05_4eb4_bcc1_d476094c9629.slice/crio-2593f165ed1d9172642d3a3ce9037ce384c4ee2e8f5b0dd32919c1afa06962fe WatchSource:0}: Error finding container 2593f165ed1d9172642d3a3ce9037ce384c4ee2e8f5b0dd32919c1afa06962fe: Status 404 returned error can't find the container with id 2593f165ed1d9172642d3a3ce9037ce384c4ee2e8f5b0dd32919c1afa06962fe Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.794669 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.796732 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 12:36:39 crc kubenswrapper[4752]: I1124 12:36:39.803621 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 12:36:40 crc kubenswrapper[4752]: I1124 12:36:40.606121 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerID="d157a34ec1e14de37832c067c582ad26d1f5490de44b8d27eae6feba5fa54941" exitCode=0 Nov 24 12:36:40 crc kubenswrapper[4752]: I1124 12:36:40.606213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" event={"ID":"7a8699ad-3e05-4eb4-bcc1-d476094c9629","Type":"ContainerDied","Data":"d157a34ec1e14de37832c067c582ad26d1f5490de44b8d27eae6feba5fa54941"} Nov 24 12:36:40 crc kubenswrapper[4752]: I1124 12:36:40.606672 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" event={"ID":"7a8699ad-3e05-4eb4-bcc1-d476094c9629","Type":"ContainerStarted","Data":"2593f165ed1d9172642d3a3ce9037ce384c4ee2e8f5b0dd32919c1afa06962fe"} Nov 24 12:36:40 crc kubenswrapper[4752]: I1124 12:36:40.610302 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 12:36:41 crc kubenswrapper[4752]: I1124 12:36:41.622965 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" event={"ID":"7a8699ad-3e05-4eb4-bcc1-d476094c9629","Type":"ContainerStarted","Data":"889690f91cb28a25ac4b6fb6ceac8ee093072c57bcd40b593278e6f41a5a5c43"} Nov 24 12:36:41 crc kubenswrapper[4752]: I1124 12:36:41.623412 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:42 crc kubenswrapper[4752]: I1124 12:36:42.730068 4752 scope.go:117] "RemoveContainer" containerID="efbf80a5674af4565dca3f59641ecbc15d40138ce03b714be6ba9f77b29ba893" Nov 24 12:36:45 crc kubenswrapper[4752]: I1124 12:36:45.469215 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:36:45 crc kubenswrapper[4752]: I1124 12:36:45.469723 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
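For the new dnsmasq pod the PLEG entries above show the usual init-then-main pattern: the first container (d157a34e...) exits 0 and is reported as ContainerDied, after which the main container (889690f9...) starts. A small sketch that pulls the "SyncLoop (PLEG)" events out of a capture like this one so a pod's container lifecycle reads as a simple timeline; input is the journal text on stdin, and the event payload format assumed is exactly the one shown above:

// plegevents.go - prints time, pod, event type, and a shortened container or
// sandbox ID for every PLEG event in the journal text on stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var ev = regexp.MustCompile(
	`I\d{4} (\d{2}:\d{2}:\d{2}\.\d{6}).*"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=\{"ID":"([^"]+)","Type":"([^"]+)","Data":"([^"]+)"\}`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := ev.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%s  %-40s %-16s %.12s\n", m[1], m[2], m[4], m[5])
		}
	}
}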
podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.257900 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.290507 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" podStartSLOduration=11.29047881 podStartE2EDuration="11.29047881s" podCreationTimestamp="2025-11-24 12:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:41.661635184 +0000 UTC m=+5407.646455473" watchObservedRunningTime="2025-11-24 12:36:49.29047881 +0000 UTC m=+5415.275299119" Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.340231 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"] Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.340584 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="dnsmasq-dns" containerID="cri-o://e1d6d902bef69e3f81e2100e9a35eb76b7328cc8906d6836de3351750c08b218" gracePeriod=10 Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.743382 4752 generic.go:334] "Generic (PLEG): container finished" podID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerID="e1d6d902bef69e3f81e2100e9a35eb76b7328cc8906d6836de3351750c08b218" exitCode=0 Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.743819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" event={"ID":"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc","Type":"ContainerDied","Data":"e1d6d902bef69e3f81e2100e9a35eb76b7328cc8906d6836de3351750c08b218"} Nov 24 12:36:49 crc kubenswrapper[4752]: I1124 12:36:49.915588 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.070174 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxzgm\" (UniqueName: \"kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm\") pod \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.070254 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc\") pod \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.070271 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb\") pod \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.070333 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb\") pod \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.070406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config\") pod \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\" (UID: \"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc\") " Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.076017 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm" (OuterVolumeSpecName: "kube-api-access-mxzgm") pod "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" (UID: "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc"). InnerVolumeSpecName "kube-api-access-mxzgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.121568 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" (UID: "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.128672 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config" (OuterVolumeSpecName: "config") pod "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" (UID: "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.133081 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" (UID: "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.138718 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" (UID: "9224a3bb-bf76-4b9e-bef0-01fbad9dfabc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.172409 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.172642 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.172792 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxzgm\" (UniqueName: \"kubernetes.io/projected/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-kube-api-access-mxzgm\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.172933 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.173110 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.756458 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb9495b5-5vb79" event={"ID":"9224a3bb-bf76-4b9e-bef0-01fbad9dfabc","Type":"ContainerDied","Data":"1448edd7c92627010a60ea0fab5463988f207754341bce0e5be8f3d53fb02918"} Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.756845 4752 scope.go:117] "RemoveContainer" containerID="e1d6d902bef69e3f81e2100e9a35eb76b7328cc8906d6836de3351750c08b218" Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.757018 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.803165 4752 scope.go:117] "RemoveContainer" containerID="9aa110f84bb7ad44b315dfb14b527200c50da4083dca798e11cd847a7883a555"
Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.813543 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"]
Nov 24 12:36:50 crc kubenswrapper[4752]: I1124 12:36:50.833228 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64bb9495b5-5vb79"]
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.264786 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-r4mbw"]
Nov 24 12:36:51 crc kubenswrapper[4752]: E1124 12:36:51.265316 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="dnsmasq-dns"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.265339 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="dnsmasq-dns"
Nov 24 12:36:51 crc kubenswrapper[4752]: E1124 12:36:51.265391 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="init"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.265402 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="init"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.265691 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" containerName="dnsmasq-dns"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.266635 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.285796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r4mbw"]
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.358904 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-483d-account-create-6rq5c"]
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.360443 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.362303 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.369591 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-483d-account-create-6rq5c"]
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.408274 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzqtw\" (UniqueName: \"kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.408588 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.509917 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghcmp\" (UniqueName: \"kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.510071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.510116 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqtw\" (UniqueName: \"kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.510152 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.510992 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.527361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqtw\" (UniqueName: \"kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw\") pod \"cinder-db-create-r4mbw\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") " pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.595040 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.611951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghcmp\" (UniqueName: \"kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.612119 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.612779 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.633002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghcmp\" (UniqueName: \"kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp\") pod \"cinder-483d-account-create-6rq5c\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") " pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:51 crc kubenswrapper[4752]: I1124 12:36:51.686483 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.083452 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r4mbw"]
Nov 24 12:36:52 crc kubenswrapper[4752]: W1124 12:36:52.233797 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf5574bf_0917_4afa_b978_788f73ad765f.slice/crio-64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748 WatchSource:0}: Error finding container 64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748: Status 404 returned error can't find the container with id 64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.234742 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-483d-account-create-6rq5c"]
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.738693 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9224a3bb-bf76-4b9e-bef0-01fbad9dfabc" path="/var/lib/kubelet/pods/9224a3bb-bf76-4b9e-bef0-01fbad9dfabc/volumes"
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.782210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r4mbw" event={"ID":"21eb33dc-8ff4-486c-94e2-6bb575ae96b1","Type":"ContainerStarted","Data":"17e2f74d0ed3b38d56f78d5df5499ca44082fa9de130f96a5c55a03a87e949e4"}
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.782265 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r4mbw" event={"ID":"21eb33dc-8ff4-486c-94e2-6bb575ae96b1","Type":"ContainerStarted","Data":"db8013a6012c3fb16d5813f5323a9be6d33b0c10526d57af42ff4f0d47819c40"}
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.788637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-483d-account-create-6rq5c" event={"ID":"df5574bf-0917-4afa-b978-788f73ad765f","Type":"ContainerStarted","Data":"7ed569668bcc9bfb02226bffc809ab3ba71d247d314b7301cba9f30bcad94d0e"}
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.788675 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-483d-account-create-6rq5c" event={"ID":"df5574bf-0917-4afa-b978-788f73ad765f","Type":"ContainerStarted","Data":"64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748"}
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.801077 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-r4mbw" podStartSLOduration=1.801054321 podStartE2EDuration="1.801054321s" podCreationTimestamp="2025-11-24 12:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:52.797340064 +0000 UTC m=+5418.782160363" watchObservedRunningTime="2025-11-24 12:36:52.801054321 +0000 UTC m=+5418.785874630"
Nov 24 12:36:52 crc kubenswrapper[4752]: I1124 12:36:52.817683 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-483d-account-create-6rq5c" podStartSLOduration=1.817666537 podStartE2EDuration="1.817666537s" podCreationTimestamp="2025-11-24 12:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:52.811897882 +0000 UTC m=+5418.796718171" watchObservedRunningTime="2025-11-24 12:36:52.817666537 +0000 UTC m=+5418.802486826"
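The "Observed pod startup duration" entries above report the time from podCreationTimestamp to the observed running time on the kubelet's own clock; since the creation timestamps are logged at whole-second precision, recomputing the duration from the log alone gives approximately, not exactly, the printed podStartSLOduration. A tiny helper sketch for that kind of arithmetic over the HH:MM:SS.micros fields in this capture (the program name and usage are illustrative):

// elapsed.go - print the elapsed time between two kubelet log timestamps,
// e.g. elapsed 12:36:51.265316 12:36:52.797340 -> 1.532024s
package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	if len(os.Args) != 3 {
		fmt.Fprintln(os.Stderr, "usage: elapsed <hh:mm:ss.micros> <hh:mm:ss.micros>")
		os.Exit(2)
	}
	const layout = "15:04:05.000000" // matches the 6-digit fractions above
	a, err1 := time.Parse(layout, os.Args[1])
	b, err2 := time.Parse(layout, os.Args[2])
	if err1 != nil || err2 != nil {
		fmt.Fprintln(os.Stderr, "bad timestamp:", err1, err2)
		os.Exit(1)
	}
	fmt.Println(b.Sub(a))
}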
Nov 24 12:36:53 crc kubenswrapper[4752]: I1124 12:36:53.813202 4752 generic.go:334] "Generic (PLEG): container finished" podID="df5574bf-0917-4afa-b978-788f73ad765f" containerID="7ed569668bcc9bfb02226bffc809ab3ba71d247d314b7301cba9f30bcad94d0e" exitCode=0
Nov 24 12:36:53 crc kubenswrapper[4752]: I1124 12:36:53.813555 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-483d-account-create-6rq5c" event={"ID":"df5574bf-0917-4afa-b978-788f73ad765f","Type":"ContainerDied","Data":"7ed569668bcc9bfb02226bffc809ab3ba71d247d314b7301cba9f30bcad94d0e"}
Nov 24 12:36:53 crc kubenswrapper[4752]: I1124 12:36:53.824051 4752 generic.go:334] "Generic (PLEG): container finished" podID="21eb33dc-8ff4-486c-94e2-6bb575ae96b1" containerID="17e2f74d0ed3b38d56f78d5df5499ca44082fa9de130f96a5c55a03a87e949e4" exitCode=0
Nov 24 12:36:53 crc kubenswrapper[4752]: I1124 12:36:53.824143 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r4mbw" event={"ID":"21eb33dc-8ff4-486c-94e2-6bb575ae96b1","Type":"ContainerDied","Data":"17e2f74d0ed3b38d56f78d5df5499ca44082fa9de130f96a5c55a03a87e949e4"}
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.341957 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.349455 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.500696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzqtw\" (UniqueName: \"kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw\") pod \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") "
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.500862 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghcmp\" (UniqueName: \"kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp\") pod \"df5574bf-0917-4afa-b978-788f73ad765f\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") "
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.501084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts\") pod \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\" (UID: \"21eb33dc-8ff4-486c-94e2-6bb575ae96b1\") "
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.501193 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts\") pod \"df5574bf-0917-4afa-b978-788f73ad765f\" (UID: \"df5574bf-0917-4afa-b978-788f73ad765f\") "
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.501834 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df5574bf-0917-4afa-b978-788f73ad765f" (UID: "df5574bf-0917-4afa-b978-788f73ad765f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.501988 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21eb33dc-8ff4-486c-94e2-6bb575ae96b1" (UID: "21eb33dc-8ff4-486c-94e2-6bb575ae96b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.509098 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw" (OuterVolumeSpecName: "kube-api-access-tzqtw") pod "21eb33dc-8ff4-486c-94e2-6bb575ae96b1" (UID: "21eb33dc-8ff4-486c-94e2-6bb575ae96b1"). InnerVolumeSpecName "kube-api-access-tzqtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.509296 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp" (OuterVolumeSpecName: "kube-api-access-ghcmp") pod "df5574bf-0917-4afa-b978-788f73ad765f" (UID: "df5574bf-0917-4afa-b978-788f73ad765f"). InnerVolumeSpecName "kube-api-access-ghcmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.603680 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5574bf-0917-4afa-b978-788f73ad765f-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.603779 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzqtw\" (UniqueName: \"kubernetes.io/projected/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-kube-api-access-tzqtw\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.603802 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghcmp\" (UniqueName: \"kubernetes.io/projected/df5574bf-0917-4afa-b978-788f73ad765f-kube-api-access-ghcmp\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.603823 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21eb33dc-8ff4-486c-94e2-6bb575ae96b1-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.872057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-483d-account-create-6rq5c" event={"ID":"df5574bf-0917-4afa-b978-788f73ad765f","Type":"ContainerDied","Data":"64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748"}
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.872110 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64a212aec65be487da33cf511a2ae484ff5b5f35cf9417e3398ed76ea5e2c748"
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.872081 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-483d-account-create-6rq5c"
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.874248 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r4mbw" event={"ID":"21eb33dc-8ff4-486c-94e2-6bb575ae96b1","Type":"ContainerDied","Data":"db8013a6012c3fb16d5813f5323a9be6d33b0c10526d57af42ff4f0d47819c40"}
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.874282 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db8013a6012c3fb16d5813f5323a9be6d33b0c10526d57af42ff4f0d47819c40"
Nov 24 12:36:55 crc kubenswrapper[4752]: I1124 12:36:55.874320 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r4mbw"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.573130 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wtnw8"]
Nov 24 12:36:56 crc kubenswrapper[4752]: E1124 12:36:56.573726 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5574bf-0917-4afa-b978-788f73ad765f" containerName="mariadb-account-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.573773 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5574bf-0917-4afa-b978-788f73ad765f" containerName="mariadb-account-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: E1124 12:36:56.573839 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21eb33dc-8ff4-486c-94e2-6bb575ae96b1" containerName="mariadb-database-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.573852 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="21eb33dc-8ff4-486c-94e2-6bb575ae96b1" containerName="mariadb-database-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.574167 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5574bf-0917-4afa-b978-788f73ad765f" containerName="mariadb-account-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.574203 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="21eb33dc-8ff4-486c-94e2-6bb575ae96b1" containerName="mariadb-database-create"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.575217 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.577547 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.578146 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.580571 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ml56q"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.593044 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wtnw8"]
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.654869 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2h5\" (UniqueName: \"kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.655289 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.655428 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.655548 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.655654 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.655769 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756325 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756387 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756432 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2h5\" (UniqueName: \"kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756477 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756525 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.756559 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.757884 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.766975 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.769112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.769230 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.779012 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.788436 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2h5\" (UniqueName: \"kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5\") pod \"cinder-db-sync-wtnw8\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") " pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:56 crc kubenswrapper[4752]: I1124 12:36:56.906655 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:36:57 crc kubenswrapper[4752]: I1124 12:36:57.378798 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wtnw8"]
Nov 24 12:36:57 crc kubenswrapper[4752]: I1124 12:36:57.897675 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wtnw8" event={"ID":"150b6dd6-25f2-4854-8f0c-2088c5db37b2","Type":"ContainerStarted","Data":"a14a77dcef09bb2a9e8bac61474a9c0ff1a7b3b22e9cf156a6522e2860663803"}
Nov 24 12:36:58 crc kubenswrapper[4752]: I1124 12:36:58.909853 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wtnw8" event={"ID":"150b6dd6-25f2-4854-8f0c-2088c5db37b2","Type":"ContainerStarted","Data":"1ddfdd3eaa014a677e29890ad9b7ba6b478751bd11e918a04e2b3f15ed57f85e"}
Nov 24 12:36:58 crc kubenswrapper[4752]: I1124 12:36:58.935307 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wtnw8" podStartSLOduration=2.935288287 podStartE2EDuration="2.935288287s" podCreationTimestamp="2025-11-24 12:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:36:58.932077955 +0000 UTC m=+5424.916898254" watchObservedRunningTime="2025-11-24 12:36:58.935288287 +0000 UTC m=+5424.920108586"
Nov 24 12:37:00 crc kubenswrapper[4752]: I1124 12:37:00.930311 4752 generic.go:334] "Generic (PLEG): container finished" podID="150b6dd6-25f2-4854-8f0c-2088c5db37b2" containerID="1ddfdd3eaa014a677e29890ad9b7ba6b478751bd11e918a04e2b3f15ed57f85e" exitCode=0
Nov 24 12:37:00 crc kubenswrapper[4752]: I1124 12:37:00.930442 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wtnw8" event={"ID":"150b6dd6-25f2-4854-8f0c-2088c5db37b2","Type":"ContainerDied","Data":"1ddfdd3eaa014a677e29890ad9b7ba6b478751bd11e918a04e2b3f15ed57f85e"}
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.377486 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.571347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.571815 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2h5\" (UniqueName: \"kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.572022 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.572189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.572654 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.572872 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts\") pod \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\" (UID: \"150b6dd6-25f2-4854-8f0c-2088c5db37b2\") "
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.572727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.574162 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/150b6dd6-25f2-4854-8f0c-2088c5db37b2-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.577798 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.583319 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts" (OuterVolumeSpecName: "scripts") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.595001 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5" (OuterVolumeSpecName: "kube-api-access-xj2h5") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "kube-api-access-xj2h5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.615793 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.630957 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data" (OuterVolumeSpecName: "config-data") pod "150b6dd6-25f2-4854-8f0c-2088c5db37b2" (UID: "150b6dd6-25f2-4854-8f0c-2088c5db37b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.675163 4752 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.675346 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.675445 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.675523 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj2h5\" (UniqueName: \"kubernetes.io/projected/150b6dd6-25f2-4854-8f0c-2088c5db37b2-kube-api-access-xj2h5\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.675594 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/150b6dd6-25f2-4854-8f0c-2088c5db37b2-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.958775 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wtnw8" event={"ID":"150b6dd6-25f2-4854-8f0c-2088c5db37b2","Type":"ContainerDied","Data":"a14a77dcef09bb2a9e8bac61474a9c0ff1a7b3b22e9cf156a6522e2860663803"}
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.958842 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14a77dcef09bb2a9e8bac61474a9c0ff1a7b3b22e9cf156a6522e2860663803"
Nov 24 12:37:02 crc kubenswrapper[4752]: I1124 12:37:02.958884 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wtnw8"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.329479 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"]
Nov 24 12:37:03 crc kubenswrapper[4752]: E1124 12:37:03.338280 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150b6dd6-25f2-4854-8f0c-2088c5db37b2" containerName="cinder-db-sync"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.338325 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="150b6dd6-25f2-4854-8f0c-2088c5db37b2" containerName="cinder-db-sync"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.338727 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="150b6dd6-25f2-4854-8f0c-2088c5db37b2" containerName="cinder-db-sync"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.339977 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.388196 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"]
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.390802 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.390861 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7nxd\" (UniqueName: \"kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.390943 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.390999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.391055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.492154 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.492196 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.492219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7nxd\" (UniqueName: \"kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.492286 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.492326 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.493221 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.493622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.493766 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.494041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.511544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7nxd\" (UniqueName: \"kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd\") pod \"dnsmasq-dns-6fd99f9cb5-xl88k\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.614459 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.616236 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.622837 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.623149 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ml56q"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.623426 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.628755 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.643186 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.680949 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699427 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699493 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699513 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699570 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699592 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699623 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8c5\" (UniqueName: \"kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.699652 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800309 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800366 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800461 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800547 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8c5\" (UniqueName: \"kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800589 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.800911 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.801922 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.804879 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.805502 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.807060 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.815948 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.827207 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8c5\" (UniqueName: \"kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5\") pod \"cinder-api-0\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " pod="openstack/cinder-api-0"
Nov 24 12:37:03 crc kubenswrapper[4752]: I1124 12:37:03.938849 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.185824 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"]
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.449546 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 24 12:37:04 crc kubenswrapper[4752]: W1124 12:37:04.491886 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd76c10_aef6_4597_9937_e9ec21321a89.slice/crio-fba1e60892eb93d1fd3d851ccfd4aeede217597564a739d3f175b36a75400a81 WatchSource:0}: Error finding container fba1e60892eb93d1fd3d851ccfd4aeede217597564a739d3f175b36a75400a81: Status 404 returned error can't find the container with id fba1e60892eb93d1fd3d851ccfd4aeede217597564a739d3f175b36a75400a81
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.977886 4752 generic.go:334] "Generic (PLEG): container finished" podID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerID="cca1b550908ef80250525dc8f0ed13ea8ec3d7f84c7e38e339ed6318e9cccca7" exitCode=0
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.978225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" event={"ID":"e0a1d649-16e0-474e-ae26-2c0f17f2ef74","Type":"ContainerDied","Data":"cca1b550908ef80250525dc8f0ed13ea8ec3d7f84c7e38e339ed6318e9cccca7"}
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.978325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" event={"ID":"e0a1d649-16e0-474e-ae26-2c0f17f2ef74","Type":"ContainerStarted","Data":"3effb433d190370929b7ef4b23f91d6a6356e0d4fc65b0cfb98bd45919a5f5ca"}
Nov 24 12:37:04 crc kubenswrapper[4752]: I1124 12:37:04.981421 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerStarted","Data":"fba1e60892eb93d1fd3d851ccfd4aeede217597564a739d3f175b36a75400a81"}
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:05.999934 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" event={"ID":"e0a1d649-16e0-474e-ae26-2c0f17f2ef74","Type":"ContainerStarted","Data":"ab8869d564759e9fc6fcf1fbbfec19c6b2e4acce0b1acbff0de373405c7023fa"}
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.000867 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k"
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.006378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerStarted","Data":"2a54b902784c1bf8c77233fbff52a962739d578cf72af273e27de0975becc04f"}
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.006430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerStarted","Data":"4bc47068e9f871859110d4156867d2bc4007226c5b47af9a4a36112882caa235"}
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.006737 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.022450 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" podStartSLOduration=3.022431742 podStartE2EDuration="3.022431742s" podCreationTimestamp="2025-11-24 12:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:06.016234304 +0000 UTC m=+5432.001054593" watchObservedRunningTime="2025-11-24 12:37:06.022431742 +0000 UTC m=+5432.007252031"
podStartE2EDuration="3.022431742s" podCreationTimestamp="2025-11-24 12:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:06.016234304 +0000 UTC m=+5432.001054593" watchObservedRunningTime="2025-11-24 12:37:06.022431742 +0000 UTC m=+5432.007252031" Nov 24 12:37:06 crc kubenswrapper[4752]: I1124 12:37:06.034992 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.034971441 podStartE2EDuration="3.034971441s" podCreationTimestamp="2025-11-24 12:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:06.032085629 +0000 UTC m=+5432.016905958" watchObservedRunningTime="2025-11-24 12:37:06.034971441 +0000 UTC m=+5432.019791740" Nov 24 12:37:13 crc kubenswrapper[4752]: I1124 12:37:13.682927 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" Nov 24 12:37:13 crc kubenswrapper[4752]: I1124 12:37:13.751415 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:37:13 crc kubenswrapper[4752]: I1124 12:37:13.751661 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="dnsmasq-dns" containerID="cri-o://889690f91cb28a25ac4b6fb6ceac8ee093072c57bcd40b593278e6f41a5a5c43" gracePeriod=10 Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.097203 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerID="889690f91cb28a25ac4b6fb6ceac8ee093072c57bcd40b593278e6f41a5a5c43" exitCode=0 Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.097250 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" event={"ID":"7a8699ad-3e05-4eb4-bcc1-d476094c9629","Type":"ContainerDied","Data":"889690f91cb28a25ac4b6fb6ceac8ee093072c57bcd40b593278e6f41a5a5c43"} Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.444676 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.644472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb\") pod \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.644617 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config\") pod \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.644677 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc\") pod \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.644763 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942rz\" (UniqueName: \"kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz\") pod \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.644908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb\") pod \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\" (UID: \"7a8699ad-3e05-4eb4-bcc1-d476094c9629\") " Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.658973 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz" (OuterVolumeSpecName: "kube-api-access-942rz") pod "7a8699ad-3e05-4eb4-bcc1-d476094c9629" (UID: "7a8699ad-3e05-4eb4-bcc1-d476094c9629"). InnerVolumeSpecName "kube-api-access-942rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.706390 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a8699ad-3e05-4eb4-bcc1-d476094c9629" (UID: "7a8699ad-3e05-4eb4-bcc1-d476094c9629"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.724296 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a8699ad-3e05-4eb4-bcc1-d476094c9629" (UID: "7a8699ad-3e05-4eb4-bcc1-d476094c9629"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.732129 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a8699ad-3e05-4eb4-bcc1-d476094c9629" (UID: "7a8699ad-3e05-4eb4-bcc1-d476094c9629"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.744190 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config" (OuterVolumeSpecName: "config") pod "7a8699ad-3e05-4eb4-bcc1-d476094c9629" (UID: "7a8699ad-3e05-4eb4-bcc1-d476094c9629"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.750919 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.754184 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.754205 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.754215 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a8699ad-3e05-4eb4-bcc1-d476094c9629-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:14 crc kubenswrapper[4752]: I1124 12:37:14.754226 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942rz\" (UniqueName: \"kubernetes.io/projected/7a8699ad-3e05-4eb4-bcc1-d476094c9629-kube-api-access-942rz\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.108028 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" event={"ID":"7a8699ad-3e05-4eb4-bcc1-d476094c9629","Type":"ContainerDied","Data":"2593f165ed1d9172642d3a3ce9037ce384c4ee2e8f5b0dd32919c1afa06962fe"} Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.108091 4752 scope.go:117] "RemoveContainer" containerID="889690f91cb28a25ac4b6fb6ceac8ee093072c57bcd40b593278e6f41a5a5c43" Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.108236 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.130117 4752 scope.go:117] "RemoveContainer" containerID="d157a34ec1e14de37832c067c582ad26d1f5490de44b8d27eae6feba5fa54941" Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.148242 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.162632 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57958c8f89-5j58w"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.224849 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.225045 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.243872 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.244217 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" containerID="cri-o://58980bb9e0653822f9cec600c9f30f3d298cd6e78077c65e7784c608d941999e" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.244737 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-api" containerID="cri-o://6768b83abb28a36d2a6ed96add9a5c55c2e434e3164c29ee672f564f229f0568" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.268820 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.269123 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5edffa8ecc999bd66f216c289dd13f966726c56ab67392ec9c8f239b3f686818" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.283866 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.284144 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ff4093e0-835e-4833-afb9-bbd891f00f93" containerName="nova-scheduler-scheduler" containerID="cri-o://f285d79c70d707f82914d71de3363f5ad93fc63c12316bd22012b8a653ce79ef" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.293187 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.293449 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" containerID="cri-o://de3314570b7010816dc37a0aa540096ee297b7eaac083fc6d467ae1624081b06" gracePeriod=30 Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.293963 4752 
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.394210 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.394565 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" gracePeriod=30
Nov 24 12:37:15 crc kubenswrapper[4752]: E1124 12:37:15.396163 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 24 12:37:15 crc kubenswrapper[4752]: E1124 12:37:15.401631 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 24 12:37:15 crc kubenswrapper[4752]: E1124 12:37:15.402862 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 24 12:37:15 crc kubenswrapper[4752]: E1124 12:37:15.402928 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor"
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.468878 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.468942 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.468989 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.469633 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 12:37:15 crc kubenswrapper[4752]: I1124 12:37:15.469710 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b" gracePeriod=600
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.123370 4752 generic.go:334] "Generic (PLEG): container finished" podID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerID="58980bb9e0653822f9cec600c9f30f3d298cd6e78077c65e7784c608d941999e" exitCode=143
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.123505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerDied","Data":"58980bb9e0653822f9cec600c9f30f3d298cd6e78077c65e7784c608d941999e"}
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.126679 4752 generic.go:334] "Generic (PLEG): container finished" podID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" containerID="5edffa8ecc999bd66f216c289dd13f966726c56ab67392ec9c8f239b3f686818" exitCode=0
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.126767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d244a063-29c4-4ae4-b3f3-35dd3232d55b","Type":"ContainerDied","Data":"5edffa8ecc999bd66f216c289dd13f966726c56ab67392ec9c8f239b3f686818"}
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.128881 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b" exitCode=0
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.128928 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b"}
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.128956 4752 scope.go:117] "RemoveContainer" containerID="294534fa96e641fffe9b0cce6e2bc4d525450d730fff13a2ed6629d46358ee8a"
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.131601 4752 generic.go:334] "Generic (PLEG): container finished" podID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerID="de3314570b7010816dc37a0aa540096ee297b7eaac083fc6d467ae1624081b06" exitCode=143
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.131702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerDied","Data":"de3314570b7010816dc37a0aa540096ee297b7eaac083fc6d467ae1624081b06"}
Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.184331 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Nov 24 12:37:16 crc kubenswrapper[4752]: E1124 12:37:16.212277 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
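
Annotation: the "ExecSync cmd from runtime service failed" errors are the conductor pods' exec readiness probes still firing while their containers shut down; CRI-O refuses new execs on a stopping container, so the probes error until the pods are gone. The sketch below reconstructs probe definitions consistent with this output. The exec command and the health URL are quoted verbatim from the log; the numeric settings are assumptions, as is the field layout (recent k8s.io/api, where the embedded handler field is named ProbeHandler; older releases call it Handler).

```go
// Editorial reconstruction, not the cluster's actual manifests.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Readiness probe behind the ExecSync errors: pgrep for a nova-conductor
	// process restricted to runstates D, R, S, T.
	conductorReadiness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/usr/bin/pgrep", "-r", "DRST", "nova-conductor"},
			},
		},
		PeriodSeconds: 10, // assumption
	}

	// Liveness probe that restarted machine-config-daemon: an HTTP GET on
	// 127.0.0.1:8798/health, which failed with "connection refused".
	mcdLiveness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		FailureThreshold: 3, // assumption
	}

	fmt.Println(conductorReadiness.Exec.Command, mcdLiveness.HTTPGet.Port.IntValue())
}
```

Note the different consequences: a failed liveness probe kills and restarts the container (as happened to machine-config-daemon), while the failing readiness probes here are harmless noise during an ordinary graceful shutdown.
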
containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:16 crc kubenswrapper[4752]: E1124 12:37:16.240961 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:16 crc kubenswrapper[4752]: E1124 12:37:16.243189 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:16 crc kubenswrapper[4752]: E1124 12:37:16.243240 4752 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" containerName="nova-cell0-conductor-conductor" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.436961 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.620031 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data\") pod \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.620296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvc8h\" (UniqueName: \"kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h\") pod \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.620357 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle\") pod \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\" (UID: \"d244a063-29c4-4ae4-b3f3-35dd3232d55b\") " Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.635279 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h" (OuterVolumeSpecName: "kube-api-access-xvc8h") pod "d244a063-29c4-4ae4-b3f3-35dd3232d55b" (UID: "d244a063-29c4-4ae4-b3f3-35dd3232d55b"). InnerVolumeSpecName "kube-api-access-xvc8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.662084 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data" (OuterVolumeSpecName: "config-data") pod "d244a063-29c4-4ae4-b3f3-35dd3232d55b" (UID: "d244a063-29c4-4ae4-b3f3-35dd3232d55b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.662241 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d244a063-29c4-4ae4-b3f3-35dd3232d55b" (UID: "d244a063-29c4-4ae4-b3f3-35dd3232d55b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.722862 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvc8h\" (UniqueName: \"kubernetes.io/projected/d244a063-29c4-4ae4-b3f3-35dd3232d55b-kube-api-access-xvc8h\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.722900 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.722911 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d244a063-29c4-4ae4-b3f3-35dd3232d55b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:16 crc kubenswrapper[4752]: I1124 12:37:16.743028 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" path="/var/lib/kubelet/pods/7a8699ad-3e05-4eb4-bcc1-d476094c9629/volumes" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.149689 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.149824 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d244a063-29c4-4ae4-b3f3-35dd3232d55b","Type":"ContainerDied","Data":"aec4a9ccc6524a446b7b7734b07746cc8534261dc0ffcb51f0c2601396c2cd81"} Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.150562 4752 scope.go:117] "RemoveContainer" containerID="5edffa8ecc999bd66f216c289dd13f966726c56ab67392ec9c8f239b3f686818" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.159130 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"} Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.186955 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.196600 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218111 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:17 crc kubenswrapper[4752]: E1124 12:37:17.218622 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="init" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218638 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="init" Nov 24 12:37:17 crc kubenswrapper[4752]: E1124 12:37:17.218651 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="dnsmasq-dns" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218661 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="dnsmasq-dns" Nov 24 12:37:17 crc kubenswrapper[4752]: E1124 12:37:17.218700 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218709 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218943 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="dnsmasq-dns" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.218968 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.219773 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.223461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.228180 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.330984 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq7gv\" (UniqueName: \"kubernetes.io/projected/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-kube-api-access-sq7gv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.331280 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.331373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.432615 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq7gv\" (UniqueName: \"kubernetes.io/projected/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-kube-api-access-sq7gv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.432952 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.433915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.439444 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.465698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq7gv\" (UniqueName: \"kubernetes.io/projected/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-kube-api-access-sq7gv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.489637 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3e6bac3-32fc-4dfe-9925-e297bc7c1059-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a3e6bac3-32fc-4dfe-9925-e297bc7c1059\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:17 crc kubenswrapper[4752]: I1124 12:37:17.571722 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:18 crc kubenswrapper[4752]: I1124 12:37:18.067303 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 12:37:18 crc kubenswrapper[4752]: I1124 12:37:18.181213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3e6bac3-32fc-4dfe-9925-e297bc7c1059","Type":"ContainerStarted","Data":"bbacb006933a1a64a297a33dde03c8974abb314740528fd67f2fe91f231e953b"} Nov 24 12:37:18 crc kubenswrapper[4752]: I1124 12:37:18.703043 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": dial tcp 10.217.1.73:8774: connect: connection refused" Nov 24 12:37:18 crc kubenswrapper[4752]: I1124 12:37:18.706839 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": dial tcp 10.217.1.73:8774: connect: connection refused" Nov 24 12:37:18 crc kubenswrapper[4752]: I1124 12:37:18.737380 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d244a063-29c4-4ae4-b3f3-35dd3232d55b" path="/var/lib/kubelet/pods/d244a063-29c4-4ae4-b3f3-35dd3232d55b/volumes" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.197251 4752 generic.go:334] "Generic (PLEG): container finished" podID="ff4093e0-835e-4833-afb9-bbd891f00f93" containerID="f285d79c70d707f82914d71de3363f5ad93fc63c12316bd22012b8a653ce79ef" exitCode=0 Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.197573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"ff4093e0-835e-4833-afb9-bbd891f00f93","Type":"ContainerDied","Data":"f285d79c70d707f82914d71de3363f5ad93fc63c12316bd22012b8a653ce79ef"} Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.199880 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a3e6bac3-32fc-4dfe-9925-e297bc7c1059","Type":"ContainerStarted","Data":"2428a452a149ac5cc50b638504ce290b551642a52e5de6b959d7de021da154d5"} Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.202117 4752 generic.go:334] "Generic (PLEG): container finished" podID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerID="6768b83abb28a36d2a6ed96add9a5c55c2e434e3164c29ee672f564f229f0568" exitCode=0 Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.202159 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerDied","Data":"6768b83abb28a36d2a6ed96add9a5c55c2e434e3164c29ee672f564f229f0568"} Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.205184 4752 generic.go:334] "Generic (PLEG): container finished" podID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerID="6b0f22bbd3c5774761af0482964d2ad3f1c3f5e2bf85245a0374282b45094763" exitCode=0 Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.205204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerDied","Data":"6b0f22bbd3c5774761af0482964d2ad3f1c3f5e2bf85245a0374282b45094763"} Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.232214 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.232195332 podStartE2EDuration="2.232195332s" podCreationTimestamp="2025-11-24 12:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:19.220464725 +0000 UTC m=+5445.205285014" watchObservedRunningTime="2025-11-24 12:37:19.232195332 +0000 UTC m=+5445.217015611" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.264045 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57958c8f89-5j58w" podUID="7a8699ad-3e05-4eb4-bcc1-d476094c9629" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.76:5353: i/o timeout" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.332327 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.382140 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvmft\" (UniqueName: \"kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft\") pod \"ff4093e0-835e-4833-afb9-bbd891f00f93\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.382227 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data\") pod \"ff4093e0-835e-4833-afb9-bbd891f00f93\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.382267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle\") pod \"ff4093e0-835e-4833-afb9-bbd891f00f93\" (UID: \"ff4093e0-835e-4833-afb9-bbd891f00f93\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.432520 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff4093e0-835e-4833-afb9-bbd891f00f93" (UID: "ff4093e0-835e-4833-afb9-bbd891f00f93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.440497 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft" (OuterVolumeSpecName: "kube-api-access-pvmft") pod "ff4093e0-835e-4833-afb9-bbd891f00f93" (UID: "ff4093e0-835e-4833-afb9-bbd891f00f93"). InnerVolumeSpecName "kube-api-access-pvmft". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.483850 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.483899 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvmft\" (UniqueName: \"kubernetes.io/projected/ff4093e0-835e-4833-afb9-bbd891f00f93-kube-api-access-pvmft\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.490522 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data" (OuterVolumeSpecName: "config-data") pod "ff4093e0-835e-4833-afb9-bbd891f00f93" (UID: "ff4093e0-835e-4833-afb9-bbd891f00f93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.588716 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4093e0-835e-4833-afb9-bbd891f00f93-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.664582 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.691178 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data\") pod \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.691231 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle\") pod \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.691259 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2f7f\" (UniqueName: \"kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f\") pod \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.691333 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs\") pod \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\" (UID: \"487dd637-63f1-43f5-b9a2-2c5b8c5a2453\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.692931 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs" (OuterVolumeSpecName: "logs") pod "487dd637-63f1-43f5-b9a2-2c5b8c5a2453" (UID: "487dd637-63f1-43f5-b9a2-2c5b8c5a2453"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.711918 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f" (OuterVolumeSpecName: "kube-api-access-x2f7f") pod "487dd637-63f1-43f5-b9a2-2c5b8c5a2453" (UID: "487dd637-63f1-43f5-b9a2-2c5b8c5a2453"). InnerVolumeSpecName "kube-api-access-x2f7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.752928 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "487dd637-63f1-43f5-b9a2-2c5b8c5a2453" (UID: "487dd637-63f1-43f5-b9a2-2c5b8c5a2453"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.774889 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data" (OuterVolumeSpecName: "config-data") pod "487dd637-63f1-43f5-b9a2-2c5b8c5a2453" (UID: "487dd637-63f1-43f5-b9a2-2c5b8c5a2453"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.793322 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.793357 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.793371 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2f7f\" (UniqueName: \"kubernetes.io/projected/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-kube-api-access-x2f7f\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.793383 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/487dd637-63f1-43f5-b9a2-2c5b8c5a2453-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.795974 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs" (OuterVolumeSpecName: "logs") pod "cf2ced4e-3e49-4591-a1c2-88cb0941afd4" (UID: "cf2ced4e-3e49-4591-a1c2-88cb0941afd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996315 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs\") pod \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data\") pod \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996473 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxkb\" (UniqueName: \"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb\") pod \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996501 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle\") pod \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\" (UID: \"cf2ced4e-3e49-4591-a1c2-88cb0941afd4\") " Nov 24 12:37:19 crc kubenswrapper[4752]: I1124 12:37:19.996977 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.001472 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb" (OuterVolumeSpecName: "kube-api-access-8cxkb") pod "cf2ced4e-3e49-4591-a1c2-88cb0941afd4" (UID: "cf2ced4e-3e49-4591-a1c2-88cb0941afd4"). InnerVolumeSpecName "kube-api-access-8cxkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.022184 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data" (OuterVolumeSpecName: "config-data") pod "cf2ced4e-3e49-4591-a1c2-88cb0941afd4" (UID: "cf2ced4e-3e49-4591-a1c2-88cb0941afd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.025675 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf2ced4e-3e49-4591-a1c2-88cb0941afd4" (UID: "cf2ced4e-3e49-4591-a1c2-88cb0941afd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.098412 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.098448 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cxkb\" (UniqueName: \"kubernetes.io/projected/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-kube-api-access-8cxkb\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.098461 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2ced4e-3e49-4591-a1c2-88cb0941afd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.218544 4752 generic.go:334] "Generic (PLEG): container finished" podID="7c82b290-1172-460e-9b64-5a73c92229d0" containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" exitCode=0 Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.218603 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c82b290-1172-460e-9b64-5a73c92229d0","Type":"ContainerDied","Data":"8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d"} Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.224772 4752 generic.go:334] "Generic (PLEG): container finished" podID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" exitCode=0 Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.224840 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0418f6f-75c8-4dac-b7d4-c95946071dec","Type":"ContainerDied","Data":"c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2"} Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.231067 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"487dd637-63f1-43f5-b9a2-2c5b8c5a2453","Type":"ContainerDied","Data":"c6878aa6d97d8733fde17c0133cd3b7994a5a0e1449f38a3a1c6b796a9eea970"} Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.231100 4752 scope.go:117] "RemoveContainer" 
containerID="6768b83abb28a36d2a6ed96add9a5c55c2e434e3164c29ee672f564f229f0568" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.231186 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.273076 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf2ced4e-3e49-4591-a1c2-88cb0941afd4","Type":"ContainerDied","Data":"68b5ae7fd207080fc8a812b70ec9d0b6d3db436f7b2e8fc3101a6b0e86154221"} Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.273176 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.310784 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.310819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff4093e0-835e-4833-afb9-bbd891f00f93","Type":"ContainerDied","Data":"cc22278d447679c4d20d95bc3a68063e9a3447c9d62ca2ade43e7a89c8300084"} Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.348074 4752 scope.go:117] "RemoveContainer" containerID="58980bb9e0653822f9cec600c9f30f3d298cd6e78077c65e7784c608d941999e" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.349675 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.371160 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.402911 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.403342 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403360 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.403373 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-api" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403382 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-api" Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.403390 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4093e0-835e-4833-afb9-bbd891f00f93" containerName="nova-scheduler-scheduler" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403396 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4093e0-835e-4833-afb9-bbd891f00f93" containerName="nova-scheduler-scheduler" Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.403406 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-metadata" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403412 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-metadata" Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.403427 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403433 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403604 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4093e0-835e-4833-afb9-bbd891f00f93" containerName="nova-scheduler-scheduler" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403619 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403635 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-api" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403644 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-metadata" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.403655 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" containerName="nova-api-log" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.404550 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.406201 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.406236 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.406421 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqb4w\" (UniqueName: \"kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.406450 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.406622 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2 is running failed: container process not found" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.407908 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2 is running failed: container process not found" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.410961 4752 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2 is running failed: container process not found" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.410995 4752 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.413804 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.421320 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.425816 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.430498 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.446236 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.447176 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.447848 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.449977 4752 scope.go:117] "RemoveContainer" containerID="6b0f22bbd3c5774761af0482964d2ad3f1c3f5e2bf85245a0374282b45094763" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.456166 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.458313 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.501036 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512479 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512575 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqb4w\" (UniqueName: \"kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512615 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512800 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512823 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhx5\" (UniqueName: \"kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.512906 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data\") pod \"nova-api-0\" 
(UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.513560 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.517790 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.521231 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.533449 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.544712 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqb4w\" (UniqueName: \"kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w\") pod \"nova-api-0\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.545644 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.556694 4752 scope.go:117] "RemoveContainer" containerID="de3314570b7010816dc37a0aa540096ee297b7eaac083fc6d467ae1624081b06" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.557080 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: E1124 12:37:20.557469 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.557484 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.557678 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" containerName="nova-cell1-conductor-conductor" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.558451 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.558579 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.563509 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.585954 4752 scope.go:117] "RemoveContainer" containerID="f285d79c70d707f82914d71de3363f5ad93fc63c12316bd22012b8a653ce79ef" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.599007 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.619827 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvptj\" (UniqueName: \"kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj\") pod \"a0418f6f-75c8-4dac-b7d4-c95946071dec\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.620216 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle\") pod \"a0418f6f-75c8-4dac-b7d4-c95946071dec\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.622571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data\") pod \"a0418f6f-75c8-4dac-b7d4-c95946071dec\" (UID: \"a0418f6f-75c8-4dac-b7d4-c95946071dec\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623062 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623207 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623296 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhx5\" (UniqueName: \"kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623384 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w8m\" (UniqueName: \"kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.623638 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.625153 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.625665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.626431 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj" (OuterVolumeSpecName: "kube-api-access-pvptj") pod "a0418f6f-75c8-4dac-b7d4-c95946071dec" (UID: "a0418f6f-75c8-4dac-b7d4-c95946071dec"). InnerVolumeSpecName "kube-api-access-pvptj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.627333 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.628351 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.642091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhx5\" (UniqueName: \"kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5\") pod \"nova-metadata-0\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.647903 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data" (OuterVolumeSpecName: "config-data") pod "a0418f6f-75c8-4dac-b7d4-c95946071dec" (UID: "a0418f6f-75c8-4dac-b7d4-c95946071dec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.661639 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0418f6f-75c8-4dac-b7d4-c95946071dec" (UID: "a0418f6f-75c8-4dac-b7d4-c95946071dec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.726927 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle\") pod \"7c82b290-1172-460e-9b64-5a73c92229d0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.727064 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb56j\" (UniqueName: \"kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j\") pod \"7c82b290-1172-460e-9b64-5a73c92229d0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.727176 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data\") pod \"7c82b290-1172-460e-9b64-5a73c92229d0\" (UID: \"7c82b290-1172-460e-9b64-5a73c92229d0\") " Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.727363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.727395 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w8m\" (UniqueName: \"kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.727433 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.728163 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.729863 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0418f6f-75c8-4dac-b7d4-c95946071dec-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.730426 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvptj\" (UniqueName: \"kubernetes.io/projected/a0418f6f-75c8-4dac-b7d4-c95946071dec-kube-api-access-pvptj\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.733537 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.734212 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.745590 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487dd637-63f1-43f5-b9a2-2c5b8c5a2453" path="/var/lib/kubelet/pods/487dd637-63f1-43f5-b9a2-2c5b8c5a2453/volumes" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.746077 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w8m\" (UniqueName: \"kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m\") pod \"nova-scheduler-0\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " pod="openstack/nova-scheduler-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.746280 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" path="/var/lib/kubelet/pods/cf2ced4e-3e49-4591-a1c2-88cb0941afd4/volumes" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.747167 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4093e0-835e-4833-afb9-bbd891f00f93" path="/var/lib/kubelet/pods/ff4093e0-835e-4833-afb9-bbd891f00f93/volumes" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.759970 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j" (OuterVolumeSpecName: "kube-api-access-hb56j") pod "7c82b290-1172-460e-9b64-5a73c92229d0" (UID: "7c82b290-1172-460e-9b64-5a73c92229d0"). InnerVolumeSpecName "kube-api-access-hb56j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.760046 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data" (OuterVolumeSpecName: "config-data") pod "7c82b290-1172-460e-9b64-5a73c92229d0" (UID: "7c82b290-1172-460e-9b64-5a73c92229d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.761425 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c82b290-1172-460e-9b64-5a73c92229d0" (UID: "7c82b290-1172-460e-9b64-5a73c92229d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.834462 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb56j\" (UniqueName: \"kubernetes.io/projected/7c82b290-1172-460e-9b64-5a73c92229d0-kube-api-access-hb56j\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.834496 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.834506 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c82b290-1172-460e-9b64-5a73c92229d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.870357 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 12:37:20 crc kubenswrapper[4752]: I1124 12:37:20.908240 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.064963 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: W1124 12:37:21.077238 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57480bf6_a5ed_4c0b_b787_fc2863744323.slice/crio-ec9161adbb3f6f2858f6501a2517eba020cba302b7a1cb700122281b3fa877d9 WatchSource:0}: Error finding container ec9161adbb3f6f2858f6501a2517eba020cba302b7a1cb700122281b3fa877d9: Status 404 returned error can't find the container with id ec9161adbb3f6f2858f6501a2517eba020cba302b7a1cb700122281b3fa877d9 Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.326007 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c82b290-1172-460e-9b64-5a73c92229d0","Type":"ContainerDied","Data":"bf97fc942abb6428fee34e917bb698947e7d68921c35c7f8fbb0f66d1676f4b4"} Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.326244 4752 scope.go:117] "RemoveContainer" containerID="8ffb055c443d1b0af4eb5c2ab0dd7c04bd559e53d7e2197ca2691405749cd41d" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.326363 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.343995 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a0418f6f-75c8-4dac-b7d4-c95946071dec","Type":"ContainerDied","Data":"4f51c5d9c0bb117e0729422a349ea73baba475063206a83cdb8058ad8ccd2eb1"} Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.344045 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.355907 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerStarted","Data":"39289c74d13bc13a138076c50a5b483d89a5ef2e1c92a883045bd725d62f81d5"} Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.355961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerStarted","Data":"ec9161adbb3f6f2858f6501a2517eba020cba302b7a1cb700122281b3fa877d9"} Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.365248 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.370995 4752 scope.go:117] "RemoveContainer" containerID="c57c4f1a5a103affa7b2607ae35afd356a03cb2c59858a44fe7a879bc72b9ff2" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.481969 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.499866 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.515854 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.522845 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: E1124 12:37:21.523306 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" containerName="nova-cell0-conductor-conductor" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.523332 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" containerName="nova-cell0-conductor-conductor" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.523618 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" containerName="nova-cell0-conductor-conductor" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.524370 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.527024 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.542292 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.550959 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqbr\" (UniqueName: \"kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.551547 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.551601 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.585194 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.594198 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.603800 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.605287 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.607907 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.613379 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.653719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.653907 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.653984 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.654038 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmb46\" (UniqueName: \"kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.654161 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqbr\" (UniqueName: \"kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.654193 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.660299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.660425 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.683183 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqbr\" (UniqueName: \"kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr\") pod \"nova-cell0-conductor-0\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.755604 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.756194 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.756259 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmb46\" (UniqueName: \"kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.759906 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.761009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.773376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmb46\" (UniqueName: \"kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46\") pod \"nova-cell1-conductor-0\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.886425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:21 crc kubenswrapper[4752]: I1124 12:37:21.941338 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.372645 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerStarted","Data":"1b657b262b7c927418844cefa14a24a1f08b8545a12dc6c0372ba15a05df8e69"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.377621 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerStarted","Data":"c4f711db230ebc73693e2f6379e0f90fe08e7aafcdf39ea36f909c060c2f4c6b"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.377646 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerStarted","Data":"a56df5bb968f85b9c4b51ed2a7e57a2c030658b57c502d986589c01c7a4913f5"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.377657 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerStarted","Data":"acffa6ece8773a2be2418b546c4583b183f6d3675783f85a1309276db36ace32"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.382927 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d7b2d8a-5981-49b8-bebe-43341522af04","Type":"ContainerStarted","Data":"84a56a87791f3a03020377a51d622beb111f52f3e3611e7f4f1dd6c2e0bea6ea"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.382977 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d7b2d8a-5981-49b8-bebe-43341522af04","Type":"ContainerStarted","Data":"a52245349bad39fbaad08e3ce6f1c6051909cb812c27b73697f40a4cacd6ea91"} Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.400339 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.400321208 podStartE2EDuration="2.400321208s" podCreationTimestamp="2025-11-24 12:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:22.39166008 +0000 UTC m=+5448.376480369" watchObservedRunningTime="2025-11-24 12:37:22.400321208 +0000 UTC m=+5448.385141497" Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.419015 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.418995184 podStartE2EDuration="2.418995184s" podCreationTimestamp="2025-11-24 12:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:22.411136508 +0000 UTC m=+5448.395956797" watchObservedRunningTime="2025-11-24 12:37:22.418995184 +0000 UTC m=+5448.403815473" Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.434716 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.434695604 podStartE2EDuration="2.434695604s" podCreationTimestamp="2025-11-24 12:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:22.431096531 +0000 UTC m=+5448.415916820" watchObservedRunningTime="2025-11-24 12:37:22.434695604 +0000 UTC m=+5448.419515893" Nov 24 12:37:22 
crc kubenswrapper[4752]: I1124 12:37:22.543520 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 12:37:22 crc kubenswrapper[4752]: W1124 12:37:22.546282 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041c17c0_b4dd_46a2_ba18_2f972ad0ff6a.slice/crio-c1044144a0c40a324651100bfb962ebfd5b2fa6eacefe7c64c56ac9562bab6c9 WatchSource:0}: Error finding container c1044144a0c40a324651100bfb962ebfd5b2fa6eacefe7c64c56ac9562bab6c9: Status 404 returned error can't find the container with id c1044144a0c40a324651100bfb962ebfd5b2fa6eacefe7c64c56ac9562bab6c9 Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.572184 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.594338 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.745607 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c82b290-1172-460e-9b64-5a73c92229d0" path="/var/lib/kubelet/pods/7c82b290-1172-460e-9b64-5a73c92229d0/volumes" Nov 24 12:37:22 crc kubenswrapper[4752]: I1124 12:37:22.746383 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0418f6f-75c8-4dac-b7d4-c95946071dec" path="/var/lib/kubelet/pods/a0418f6f-75c8-4dac-b7d4-c95946071dec/volumes" Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.396344 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a","Type":"ContainerStarted","Data":"e990c3beac152b75ded134cdd0efd6eeba5575ee5897b402d556662bee762d5f"} Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.396398 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a","Type":"ContainerStarted","Data":"c1044144a0c40a324651100bfb962ebfd5b2fa6eacefe7c64c56ac9562bab6c9"} Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.396477 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.401424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d14b0aa7-9a78-47bc-b0ae-e78d717028fc","Type":"ContainerStarted","Data":"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8"} Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.401475 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d14b0aa7-9a78-47bc-b0ae-e78d717028fc","Type":"ContainerStarted","Data":"6caa0846c9f8e67e866e7e3fdccd79606ee83c416acd802ba145599757a0079e"} Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.403370 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.453067 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.453048319 podStartE2EDuration="2.453048319s" podCreationTimestamp="2025-11-24 12:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:23.447300864 +0000 UTC 
m=+5449.432121163" watchObservedRunningTime="2025-11-24 12:37:23.453048319 +0000 UTC m=+5449.437868608" Nov 24 12:37:23 crc kubenswrapper[4752]: I1124 12:37:23.453467 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.453460491 podStartE2EDuration="2.453460491s" podCreationTimestamp="2025-11-24 12:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:23.425773056 +0000 UTC m=+5449.410593355" watchObservedRunningTime="2025-11-24 12:37:23.453460491 +0000 UTC m=+5449.438280780" Nov 24 12:37:24 crc kubenswrapper[4752]: I1124 12:37:24.793159 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:24 crc kubenswrapper[4752]: I1124 12:37:24.793242 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cf2ced4e-3e49-4591-a1c2-88cb0941afd4" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:25 crc kubenswrapper[4752]: I1124 12:37:25.870966 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:37:25 crc kubenswrapper[4752]: I1124 12:37:25.871480 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 12:37:25 crc kubenswrapper[4752]: I1124 12:37:25.908411 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 12:37:27 crc kubenswrapper[4752]: I1124 12:37:27.572841 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:27 crc kubenswrapper[4752]: I1124 12:37:27.585614 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:28 crc kubenswrapper[4752]: I1124 12:37:28.477104 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.546529 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.546955 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.870686 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.870780 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.908599 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 12:37:30 crc kubenswrapper[4752]: I1124 12:37:30.940167 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.504253 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.628924 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.629257 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.915450 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.953863 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.954171 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 12:37:31 crc kubenswrapper[4752]: I1124 12:37:31.993843 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.209439 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.245934 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.247027 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.254297 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293245 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293299 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bwf\" (UniqueName: \"kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293403 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293460 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.293478 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.396877 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bwf\" (UniqueName: \"kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.397095 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.398715 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.398812 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.398935 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.399010 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.400784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.408467 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.410002 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.410136 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.421248 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.436861 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bwf\" (UniqueName: \"kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf\") pod \"cinder-scheduler-0\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:34 crc kubenswrapper[4752]: I1124 12:37:34.573475 4752 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.031976 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:35 crc kubenswrapper[4752]: W1124 12:37:35.038423 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b61e6c_8fef_4dd4_8283_6643bb8fff3f.slice/crio-026b42471a07eb922ca8a53d99895f11412a852f79b989de209130cf102e9f09 WatchSource:0}: Error finding container 026b42471a07eb922ca8a53d99895f11412a852f79b989de209130cf102e9f09: Status 404 returned error can't find the container with id 026b42471a07eb922ca8a53d99895f11412a852f79b989de209130cf102e9f09 Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.333552 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.333860 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api-log" containerID="cri-o://4bc47068e9f871859110d4156867d2bc4007226c5b47af9a4a36112882caa235" gracePeriod=30 Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.333950 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api" containerID="cri-o://2a54b902784c1bf8c77233fbff52a962739d578cf72af273e27de0975becc04f" gracePeriod=30 Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.514568 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerStarted","Data":"026b42471a07eb922ca8a53d99895f11412a852f79b989de209130cf102e9f09"} Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.516982 4752 generic.go:334] "Generic (PLEG): container finished" podID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerID="4bc47068e9f871859110d4156867d2bc4007226c5b47af9a4a36112882caa235" exitCode=143 Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.517010 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerDied","Data":"4bc47068e9f871859110d4156867d2bc4007226c5b47af9a4a36112882caa235"} Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.992153 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.994391 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:35 crc kubenswrapper[4752]: I1124 12:37:35.998259 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:35.999684 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.133758 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-run\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.133821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.133855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.133882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.133906 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134116 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134219 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4wm\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-kube-api-access-9n4wm\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134273 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 
12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134316 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134480 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134545 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134565 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134648 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134683 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.134819 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237159 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237248 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-run\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237323 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237352 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237370 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237400 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237414 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237419 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4wm\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-kube-api-access-9n4wm\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237488 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237487 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237538 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237504 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237590 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237835 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237903 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237923 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237957 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.237992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.238176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.238211 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.238213 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-run\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.238304 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/71d02fa2-b391-4f1a-9181-25e3469dd49b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.242262 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.242422 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.244103 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.250354 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.250989 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71d02fa2-b391-4f1a-9181-25e3469dd49b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.257008 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4wm\" (UniqueName: \"kubernetes.io/projected/71d02fa2-b391-4f1a-9181-25e3469dd49b-kube-api-access-9n4wm\") pod \"cinder-volume-volume1-0\" (UID: \"71d02fa2-b391-4f1a-9181-25e3469dd49b\") " pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.335161 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.440377 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.441825 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.444529 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.466400 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.538810 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerStarted","Data":"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8"} Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.539084 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerStarted","Data":"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8"} Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543778 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543826 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543850 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-run\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543883 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skbf\" (UniqueName: \"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-kube-api-access-6skbf\") pod \"cinder-backup-0\" 
(UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543936 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543954 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.543986 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544004 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-dev\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-sys\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544047 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544115 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-ceph\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544157 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544195 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-scripts\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.544220 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.572362 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.572343912 podStartE2EDuration="2.572343912s" podCreationTimestamp="2025-11-24 12:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:36.559167584 +0000 UTC m=+5462.543987873" watchObservedRunningTime="2025-11-24 12:37:36.572343912 +0000 UTC m=+5462.557164201" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.645941 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.645998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646020 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-run\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646061 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-lib-modules\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skbf\" (UniqueName: 
\"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-kube-api-access-6skbf\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646043 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-run\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646497 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646557 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646643 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-dev\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-sys\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646777 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646793 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-dev\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646802 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646833 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.646821 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647045 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647072 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-sys\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647113 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-ceph\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647149 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-scripts\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647263 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647479 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.647626 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/e731d115-86e9-4f89-b91d-955e67f8309c-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.652626 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.652921 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.653060 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.653232 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e731d115-86e9-4f89-b91d-955e67f8309c-scripts\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.654292 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-ceph\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.667220 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skbf\" (UniqueName: \"kubernetes.io/projected/e731d115-86e9-4f89-b91d-955e67f8309c-kube-api-access-6skbf\") pod \"cinder-backup-0\" (UID: \"e731d115-86e9-4f89-b91d-955e67f8309c\") " pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.783908 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.935168 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 24 12:37:36 crc kubenswrapper[4752]: I1124 12:37:36.963555 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:37:37 crc kubenswrapper[4752]: I1124 12:37:37.345913 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 24 12:37:37 crc kubenswrapper[4752]: W1124 12:37:37.346866 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode731d115_86e9_4f89_b91d_955e67f8309c.slice/crio-323c9c3c89aee6da27d97c567f1ae584525c903fbdc1b8a3ff2707d8706536ae WatchSource:0}: Error finding container 323c9c3c89aee6da27d97c567f1ae584525c903fbdc1b8a3ff2707d8706536ae: Status 404 returned error can't find the container with id 323c9c3c89aee6da27d97c567f1ae584525c903fbdc1b8a3ff2707d8706536ae Nov 24 12:37:37 crc kubenswrapper[4752]: I1124 12:37:37.548491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"71d02fa2-b391-4f1a-9181-25e3469dd49b","Type":"ContainerStarted","Data":"9a47493fa7a1de0af4c75bc97909b0c029e59f45551c91b6f52d069ad9c6f74e"} Nov 24 12:37:37 crc kubenswrapper[4752]: I1124 12:37:37.549982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e731d115-86e9-4f89-b91d-955e67f8309c","Type":"ContainerStarted","Data":"323c9c3c89aee6da27d97c567f1ae584525c903fbdc1b8a3ff2707d8706536ae"} Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.562229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"71d02fa2-b391-4f1a-9181-25e3469dd49b","Type":"ContainerStarted","Data":"1c423d78614c297c90f6dd550479fbd7e07bbf70f48bba385b30cb6ade558a22"} Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.562786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"71d02fa2-b391-4f1a-9181-25e3469dd49b","Type":"ContainerStarted","Data":"040cb29e19bfe724fae78feecc90c522905b58ac7020545fa6f3529ccab0f199"} Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.570181 4752 generic.go:334] "Generic (PLEG): container finished" podID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerID="2a54b902784c1bf8c77233fbff52a962739d578cf72af273e27de0975becc04f" exitCode=0 Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.570294 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerDied","Data":"2a54b902784c1bf8c77233fbff52a962739d578cf72af273e27de0975becc04f"} Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.572170 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e731d115-86e9-4f89-b91d-955e67f8309c","Type":"ContainerStarted","Data":"ec7dac3287c19d72693caaaeeff159e47fb4fb342b85ecffe3fc7354da284d1a"} Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.601230 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.85293291 podStartE2EDuration="3.601209447s" podCreationTimestamp="2025-11-24 12:37:35 +0000 UTC" firstStartedPulling="2025-11-24 12:37:36.962852305 +0000 UTC m=+5462.947672594" 
lastFinishedPulling="2025-11-24 12:37:37.711128842 +0000 UTC m=+5463.695949131" observedRunningTime="2025-11-24 12:37:38.592321222 +0000 UTC m=+5464.577141521" watchObservedRunningTime="2025-11-24 12:37:38.601209447 +0000 UTC m=+5464.586029746" Nov 24 12:37:38 crc kubenswrapper[4752]: I1124 12:37:38.894378 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.005366 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.006814 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.007086 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.007202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.007350 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.007455 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8c5\" (UniqueName: \"kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.007655 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id\") pod \"4bd76c10-aef6-4597-9937-e9ec21321a89\" (UID: \"4bd76c10-aef6-4597-9937-e9ec21321a89\") " Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.008287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.010235 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs" (OuterVolumeSpecName: "logs") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.020786 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts" (OuterVolumeSpecName: "scripts") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.021052 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.023227 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5" (OuterVolumeSpecName: "kube-api-access-4x8c5") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "kube-api-access-4x8c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.041325 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.080615 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data" (OuterVolumeSpecName: "config-data") pod "4bd76c10-aef6-4597-9937-e9ec21321a89" (UID: "4bd76c10-aef6-4597-9937-e9ec21321a89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112049 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112264 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112358 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8c5\" (UniqueName: \"kubernetes.io/projected/4bd76c10-aef6-4597-9937-e9ec21321a89-kube-api-access-4x8c5\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112426 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bd76c10-aef6-4597-9937-e9ec21321a89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112478 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112537 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd76c10-aef6-4597-9937-e9ec21321a89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.112630 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bd76c10-aef6-4597-9937-e9ec21321a89-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.573792 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.582295 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4bd76c10-aef6-4597-9937-e9ec21321a89","Type":"ContainerDied","Data":"fba1e60892eb93d1fd3d851ccfd4aeede217597564a739d3f175b36a75400a81"} Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.582626 4752 scope.go:117] "RemoveContainer" containerID="2a54b902784c1bf8c77233fbff52a962739d578cf72af273e27de0975becc04f" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.582368 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.585903 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e731d115-86e9-4f89-b91d-955e67f8309c","Type":"ContainerStarted","Data":"26884945b9d77e1865dfa18081f41517761dc72b71234822773361e03c15779f"} Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.625201 4752 scope.go:117] "RemoveContainer" containerID="4bc47068e9f871859110d4156867d2bc4007226c5b47af9a4a36112882caa235" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.627941 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.813819036 podStartE2EDuration="3.627917791s" podCreationTimestamp="2025-11-24 12:37:36 +0000 UTC" firstStartedPulling="2025-11-24 12:37:37.349418355 +0000 UTC m=+5463.334238644" lastFinishedPulling="2025-11-24 12:37:38.1635171 +0000 UTC m=+5464.148337399" observedRunningTime="2025-11-24 12:37:39.621215359 +0000 UTC m=+5465.606035658" watchObservedRunningTime="2025-11-24 12:37:39.627917791 +0000 UTC m=+5465.612738090" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.650614 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.661251 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.669473 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:39 crc kubenswrapper[4752]: E1124 12:37:39.669952 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.669966 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api" Nov 24 12:37:39 crc kubenswrapper[4752]: E1124 12:37:39.670283 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api-log" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.670295 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api-log" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.670928 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api-log" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.670958 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" containerName="cinder-api" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.671944 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.674517 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.721814 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725340 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df89d62-db8a-458a-a85b-cf1e95d942e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725422 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-scripts\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725456 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725501 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df89d62-db8a-458a-a85b-cf1e95d942e8-logs\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725725 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data-custom\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.725780 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmgpr\" (UniqueName: \"kubernetes.io/projected/1df89d62-db8a-458a-a85b-cf1e95d942e8-kube-api-access-xmgpr\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.827763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-scripts\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828140 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828180 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df89d62-db8a-458a-a85b-cf1e95d942e8-logs\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data-custom\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmgpr\" (UniqueName: \"kubernetes.io/projected/1df89d62-db8a-458a-a85b-cf1e95d942e8-kube-api-access-xmgpr\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828596 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1df89d62-db8a-458a-a85b-cf1e95d942e8-logs\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.828685 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df89d62-db8a-458a-a85b-cf1e95d942e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.829319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df89d62-db8a-458a-a85b-cf1e95d942e8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.832935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-scripts\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.834732 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.836555 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data-custom\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.840454 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df89d62-db8a-458a-a85b-cf1e95d942e8-config-data\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:39 crc kubenswrapper[4752]: I1124 12:37:39.847480 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmgpr\" (UniqueName: \"kubernetes.io/projected/1df89d62-db8a-458a-a85b-cf1e95d942e8-kube-api-access-xmgpr\") pod \"cinder-api-0\" (UID: \"1df89d62-db8a-458a-a85b-cf1e95d942e8\") " pod="openstack/cinder-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.018409 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.453985 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.551297 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.551616 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.554504 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.558071 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.599907 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1df89d62-db8a-458a-a85b-cf1e95d942e8","Type":"ContainerStarted","Data":"4d15f4b4de7e8671b7e7826341a1a12d4e0b7d3d2fab9c8a20cccd6ae692ae3c"} Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.603453 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.608051 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.739361 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd76c10-aef6-4597-9937-e9ec21321a89" path="/var/lib/kubelet/pods/4bd76c10-aef6-4597-9937-e9ec21321a89/volumes" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.877683 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.877757 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.882657 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 12:37:40 crc kubenswrapper[4752]: I1124 12:37:40.887260 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 12:37:41 crc kubenswrapper[4752]: I1124 12:37:41.335767 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:41 crc kubenswrapper[4752]: I1124 12:37:41.621924 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1df89d62-db8a-458a-a85b-cf1e95d942e8","Type":"ContainerStarted","Data":"ced9f598598c8990afb3b2daba553e7884af6feec4191bf1f2c2ad0a6445d267"} Nov 24 12:37:41 crc kubenswrapper[4752]: I1124 12:37:41.784341 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 24 12:37:42 crc kubenswrapper[4752]: I1124 12:37:42.637800 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1df89d62-db8a-458a-a85b-cf1e95d942e8","Type":"ContainerStarted","Data":"4a3fa6d01b0726c66b594a93af792a1463f4d6835df81b226dded40d6526d5b5"} Nov 24 12:37:42 crc kubenswrapper[4752]: I1124 12:37:42.663485 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.663458864 podStartE2EDuration="3.663458864s" podCreationTimestamp="2025-11-24 12:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:42.658310366 +0000 UTC m=+5468.643130675" watchObservedRunningTime="2025-11-24 12:37:42.663458864 +0000 UTC m=+5468.648279203" Nov 24 12:37:43 crc kubenswrapper[4752]: I1124 12:37:43.645534 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 24 12:37:44 crc kubenswrapper[4752]: I1124 12:37:44.794512 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 24 12:37:44 crc kubenswrapper[4752]: I1124 12:37:44.851897 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:45 crc kubenswrapper[4752]: I1124 12:37:45.660389 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="cinder-scheduler" containerID="cri-o://926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8" gracePeriod=30 Nov 24 12:37:45 crc kubenswrapper[4752]: I1124 12:37:45.660470 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="probe" containerID="cri-o://fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8" gracePeriod=30 Nov 24 12:37:46 crc kubenswrapper[4752]: I1124 12:37:46.588628 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 24 12:37:46 crc kubenswrapper[4752]: I1124 12:37:46.673694 4752 generic.go:334] "Generic (PLEG): container finished" podID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerID="fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8" exitCode=0 Nov 24 12:37:46 crc kubenswrapper[4752]: I1124 12:37:46.673822 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerDied","Data":"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8"} Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.066811 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.409854 4752 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.529375 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bwf\" (UniqueName: \"kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.532330 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.532388 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.532505 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.532684 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.532794 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts\") pod \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\" (UID: \"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f\") " Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.533568 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.534598 4752 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.537079 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf" (OuterVolumeSpecName: "kube-api-access-f6bwf") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "kube-api-access-f6bwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.541393 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.557057 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts" (OuterVolumeSpecName: "scripts") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.626398 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.636781 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.636811 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.636821 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.636830 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6bwf\" (UniqueName: \"kubernetes.io/projected/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-kube-api-access-f6bwf\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.649456 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data" (OuterVolumeSpecName: "config-data") pod "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" (UID: "b7b61e6c-8fef-4dd4-8283-6643bb8fff3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.686468 4752 generic.go:334] "Generic (PLEG): container finished" podID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerID="926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8" exitCode=0 Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.686647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerDied","Data":"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8"} Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.696246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b7b61e6c-8fef-4dd4-8283-6643bb8fff3f","Type":"ContainerDied","Data":"026b42471a07eb922ca8a53d99895f11412a852f79b989de209130cf102e9f09"} Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.696339 4752 scope.go:117] "RemoveContainer" containerID="fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.686772 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.737769 4752 scope.go:117] "RemoveContainer" containerID="926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.738568 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.742014 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.750847 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.769784 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:47 crc kubenswrapper[4752]: E1124 12:37:47.770227 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="cinder-scheduler" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.770247 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="cinder-scheduler" Nov 24 12:37:47 crc kubenswrapper[4752]: E1124 12:37:47.770274 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="probe" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.770279 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="probe" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.770437 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="probe" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.770456 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" containerName="cinder-scheduler" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.772239 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.776222 4752 scope.go:117] "RemoveContainer" containerID="fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.776356 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 12:37:47 crc kubenswrapper[4752]: E1124 12:37:47.777116 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8\": container with ID starting with fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8 not found: ID does not exist" containerID="fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.777158 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8"} err="failed to get container status \"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8\": rpc error: code = NotFound desc = could not find container \"fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8\": container with ID starting with fb57c505afd59c5ae2614f32e29426b5867aad3123eca32b5f902cace25dedc8 not found: ID does not exist" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.777184 4752 scope.go:117] "RemoveContainer" containerID="926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8" Nov 24 12:37:47 crc kubenswrapper[4752]: E1124 12:37:47.777370 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8\": container with ID starting with 926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8 not found: ID does not exist" containerID="926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.777399 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8"} err="failed to get container status \"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8\": rpc error: code = NotFound desc = could not find container \"926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8\": container with ID starting with 926eef375fa36b41433023bd73ed0c11109e723582e815b9baef8594dc74dba8 not found: ID does not exist" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.783788 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844250 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/019f042d-12be-4cdc-b195-470abf83bb3a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844323 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k4xl\" (UniqueName: \"kubernetes.io/projected/019f042d-12be-4cdc-b195-470abf83bb3a-kube-api-access-4k4xl\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844469 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-scripts\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.844510 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.945945 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.945998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-scripts\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.946040 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.946158 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/019f042d-12be-4cdc-b195-470abf83bb3a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.946209 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.946248 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k4xl\" (UniqueName: \"kubernetes.io/projected/019f042d-12be-4cdc-b195-470abf83bb3a-kube-api-access-4k4xl\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.946267 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/019f042d-12be-4cdc-b195-470abf83bb3a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.951300 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.953082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-scripts\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.953133 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.953624 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/019f042d-12be-4cdc-b195-470abf83bb3a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:47 crc kubenswrapper[4752]: I1124 12:37:47.963690 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k4xl\" (UniqueName: \"kubernetes.io/projected/019f042d-12be-4cdc-b195-470abf83bb3a-kube-api-access-4k4xl\") pod \"cinder-scheduler-0\" (UID: \"019f042d-12be-4cdc-b195-470abf83bb3a\") " pod="openstack/cinder-scheduler-0" Nov 24 12:37:48 crc kubenswrapper[4752]: I1124 12:37:48.090937 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 12:37:48 crc kubenswrapper[4752]: I1124 12:37:48.397905 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 12:37:48 crc kubenswrapper[4752]: I1124 12:37:48.712135 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"019f042d-12be-4cdc-b195-470abf83bb3a","Type":"ContainerStarted","Data":"cf95db168130c3d2f815f31f4322154ad78f19fd22f3193aa29c97e251f7f3a5"} Nov 24 12:37:48 crc kubenswrapper[4752]: I1124 12:37:48.746651 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b61e6c-8fef-4dd4-8283-6643bb8fff3f" path="/var/lib/kubelet/pods/b7b61e6c-8fef-4dd4-8283-6643bb8fff3f/volumes" Nov 24 12:37:49 crc kubenswrapper[4752]: I1124 12:37:49.732181 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"019f042d-12be-4cdc-b195-470abf83bb3a","Type":"ContainerStarted","Data":"1f099a82b4f45d34fd7a679057081c6acdbfe7e45438d072379aa8d1989ad67b"} Nov 24 12:37:49 crc kubenswrapper[4752]: I1124 12:37:49.732919 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"019f042d-12be-4cdc-b195-470abf83bb3a","Type":"ContainerStarted","Data":"ec9caf622c8cf6e851ed99260245545003a4f6e7c6dc065b9b335efd3cb4779e"} Nov 24 12:37:49 crc kubenswrapper[4752]: I1124 12:37:49.781414 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.781390232 podStartE2EDuration="2.781390232s" podCreationTimestamp="2025-11-24 12:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:37:49.776075499 +0000 UTC m=+5475.760895788" watchObservedRunningTime="2025-11-24 12:37:49.781390232 +0000 UTC m=+5475.766210551" Nov 24 12:37:51 crc kubenswrapper[4752]: I1124 12:37:51.752770 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 24 12:37:53 crc kubenswrapper[4752]: I1124 12:37:53.092304 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 12:37:53 crc kubenswrapper[4752]: I1124 12:37:53.500124 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.497037 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f95n8"] Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.500020 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.504614 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5pcfg" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.504686 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.510134 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f95n8"] Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.522142 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-z24v5"] Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.524266 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.526365 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z24v5"] Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607109 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzwb\" (UniqueName: \"kubernetes.io/projected/0be41fba-f8d1-426b-bedc-9d318f73bbbd-kube-api-access-7wzwb\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-log-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607353 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607432 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607500 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-run\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-etc-ovs\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607554 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0be41fba-f8d1-426b-bedc-9d318f73bbbd-scripts\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-lib\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607878 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msczk\" (UniqueName: \"kubernetes.io/projected/9e66e131-ee32-478d-86d3-c32da4efcb08-kube-api-access-msczk\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.607913 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-log\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.608017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e66e131-ee32-478d-86d3-c32da4efcb08-scripts\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.709431 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzwb\" (UniqueName: \"kubernetes.io/projected/0be41fba-f8d1-426b-bedc-9d318f73bbbd-kube-api-access-7wzwb\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.709908 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-log-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.709949 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.709986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710034 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-run\") pod \"ovn-controller-ovs-z24v5\" 
(UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710061 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-etc-ovs\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710097 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0be41fba-f8d1-426b-bedc-9d318f73bbbd-scripts\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-lib\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710153 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msczk\" (UniqueName: \"kubernetes.io/projected/9e66e131-ee32-478d-86d3-c32da4efcb08-kube-api-access-msczk\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710184 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-log\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710212 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-log-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710232 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e66e131-ee32-478d-86d3-c32da4efcb08-scripts\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710494 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-etc-ovs\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710545 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-run\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710572 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-lib\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710605 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run-ovn\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710670 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e66e131-ee32-478d-86d3-c32da4efcb08-var-log\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.710677 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0be41fba-f8d1-426b-bedc-9d318f73bbbd-var-run\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.712467 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0be41fba-f8d1-426b-bedc-9d318f73bbbd-scripts\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.712611 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e66e131-ee32-478d-86d3-c32da4efcb08-scripts\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.728892 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzwb\" (UniqueName: \"kubernetes.io/projected/0be41fba-f8d1-426b-bedc-9d318f73bbbd-kube-api-access-7wzwb\") pod \"ovn-controller-f95n8\" (UID: \"0be41fba-f8d1-426b-bedc-9d318f73bbbd\") " pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.735298 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msczk\" (UniqueName: \"kubernetes.io/projected/9e66e131-ee32-478d-86d3-c32da4efcb08-kube-api-access-msczk\") pod \"ovn-controller-ovs-z24v5\" (UID: \"9e66e131-ee32-478d-86d3-c32da4efcb08\") " pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.877289 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f95n8" Nov 24 12:39:40 crc kubenswrapper[4752]: I1124 12:39:40.884826 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.456289 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f95n8"] Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.723881 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z24v5"] Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.874689 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-t82l7"] Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.878167 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.880297 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.891473 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-t82l7"] Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.939967 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovn-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.940260 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8n6t\" (UniqueName: \"kubernetes.io/projected/02042310-6413-447c-9f15-0973957090ad-kube-api-access-m8n6t\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.940289 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovs-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:41 crc kubenswrapper[4752]: I1124 12:39:41.940337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02042310-6413-447c-9f15-0973957090ad-config\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042216 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovs-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042257 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8n6t\" (UniqueName: \"kubernetes.io/projected/02042310-6413-447c-9f15-0973957090ad-kube-api-access-m8n6t\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042288 4752 
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042390 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovn-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042553 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovs-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.042592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/02042310-6413-447c-9f15-0973957090ad-ovn-rundir\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.043126 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02042310-6413-447c-9f15-0973957090ad-config\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.062368 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8n6t\" (UniqueName: \"kubernetes.io/projected/02042310-6413-447c-9f15-0973957090ad-kube-api-access-m8n6t\") pod \"ovn-controller-metrics-t82l7\" (UID: \"02042310-6413-447c-9f15-0973957090ad\") " pod="openstack/ovn-controller-metrics-t82l7"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.100711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8" event={"ID":"0be41fba-f8d1-426b-bedc-9d318f73bbbd","Type":"ContainerStarted","Data":"20b59f1f85110bdfc01b2473b81f45b20b6779bdda157992b7bd7054a2ced9c1"}
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.100768 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8" event={"ID":"0be41fba-f8d1-426b-bedc-9d318f73bbbd","Type":"ContainerStarted","Data":"c181064023191d8b17ba117b2aa1743e0b50246ad6987f205299aaf43759921d"}
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.100806 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-f95n8"
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.102577 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z24v5" event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerStarted","Data":"23b78b7cd0e97c9a79349768f363c2813eb738b14fe177c18a2c96146433914d"}
Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.102615 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z24v5" event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerStarted","Data":"99eb4dcd22c69762c9a905937f1523aa104cf98f843f2c324cf803b98be6a428"}
event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerStarted","Data":"99eb4dcd22c69762c9a905937f1523aa104cf98f843f2c324cf803b98be6a428"} Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.115824 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-f95n8" podStartSLOduration=2.115806673 podStartE2EDuration="2.115806673s" podCreationTimestamp="2025-11-24 12:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:39:42.113947899 +0000 UTC m=+5588.098768188" watchObservedRunningTime="2025-11-24 12:39:42.115806673 +0000 UTC m=+5588.100626952" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.222034 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-t82l7" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.657846 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-t82l7"] Nov 24 12:39:42 crc kubenswrapper[4752]: W1124 12:39:42.659297 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02042310_6413_447c_9f15_0973957090ad.slice/crio-4573779a4f3b525db86dbd93eabef31466a936cb13d6dafa0c4385c091f6409f WatchSource:0}: Error finding container 4573779a4f3b525db86dbd93eabef31466a936cb13d6dafa0c4385c091f6409f: Status 404 returned error can't find the container with id 4573779a4f3b525db86dbd93eabef31466a936cb13d6dafa0c4385c091f6409f Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.696004 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-shb95"] Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.697375 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.704753 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-shb95"] Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.861112 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.861414 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77rll\" (UniqueName: \"kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.963743 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.963810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77rll\" (UniqueName: \"kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.964837 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:42 crc kubenswrapper[4752]: I1124 12:39:42.987902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77rll\" (UniqueName: \"kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll\") pod \"octavia-db-create-shb95\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " pod="openstack/octavia-db-create-shb95" Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.025021 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-shb95" Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.129641 4752 generic.go:334] "Generic (PLEG): container finished" podID="9e66e131-ee32-478d-86d3-c32da4efcb08" containerID="23b78b7cd0e97c9a79349768f363c2813eb738b14fe177c18a2c96146433914d" exitCode=0 Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.129856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z24v5" event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerDied","Data":"23b78b7cd0e97c9a79349768f363c2813eb738b14fe177c18a2c96146433914d"} Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.150449 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-t82l7" event={"ID":"02042310-6413-447c-9f15-0973957090ad","Type":"ContainerStarted","Data":"6450fd1889a6a14f3f6823241a0b9fe7335587b540c469fc5e7c2ab12106dadd"} Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.150504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-t82l7" event={"ID":"02042310-6413-447c-9f15-0973957090ad","Type":"ContainerStarted","Data":"4573779a4f3b525db86dbd93eabef31466a936cb13d6dafa0c4385c091f6409f"} Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.245062 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-t82l7" podStartSLOduration=2.245033031 podStartE2EDuration="2.245033031s" podCreationTimestamp="2025-11-24 12:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:39:43.184995306 +0000 UTC m=+5589.169815585" watchObservedRunningTime="2025-11-24 12:39:43.245033031 +0000 UTC m=+5589.229853330" Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.567034 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-shb95"] Nov 24 12:39:43 crc kubenswrapper[4752]: W1124 12:39:43.576294 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3847960_35ca_445f_8c43_06dfbef18148.slice/crio-80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2 WatchSource:0}: Error finding container 80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2: Status 404 returned error can't find the container with id 80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2 Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.957110 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-8169-account-create-76s8k"] Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.959049 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.963031 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Nov 24 12:39:43 crc kubenswrapper[4752]: I1124 12:39:43.983926 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8169-account-create-76s8k"] Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.082774 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5143-account-create-k4g52"] Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.093675 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q8jpk"] Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.102993 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5143-account-create-k4g52"] Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.106337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hcg8\" (UniqueName: \"kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.106443 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.111931 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q8jpk"] Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.161718 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z24v5" event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerStarted","Data":"4c70041aa90d49f704d91e75cf7ccc12f1b22bd5eda189876fe0594d53809050"} Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.161783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z24v5" event={"ID":"9e66e131-ee32-478d-86d3-c32da4efcb08","Type":"ContainerStarted","Data":"fdd3cb3882e7883da589f7bc682bdea56f25b2b39438fcc6f9eb465b4fe5a815"} Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.162983 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.163020 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.165244 4752 generic.go:334] "Generic (PLEG): container finished" podID="d3847960-35ca-445f-8c43-06dfbef18148" containerID="d62f337cc56a0924cbe6a1ebc6adc4ff3b24f9b292d2fa77ad418a375db03505" exitCode=0 Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.166049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-shb95" event={"ID":"d3847960-35ca-445f-8c43-06dfbef18148","Type":"ContainerDied","Data":"d62f337cc56a0924cbe6a1ebc6adc4ff3b24f9b292d2fa77ad418a375db03505"} Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.166082 4752 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/octavia-db-create-shb95" event={"ID":"d3847960-35ca-445f-8c43-06dfbef18148","Type":"ContainerStarted","Data":"80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2"} Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.184470 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-z24v5" podStartSLOduration=4.184451567 podStartE2EDuration="4.184451567s" podCreationTimestamp="2025-11-24 12:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:39:44.181579595 +0000 UTC m=+5590.166399884" watchObservedRunningTime="2025-11-24 12:39:44.184451567 +0000 UTC m=+5590.169271856" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.208586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.208736 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hcg8\" (UniqueName: \"kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.210008 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.229906 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hcg8\" (UniqueName: \"kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8\") pod \"octavia-8169-account-create-76s8k\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.317836 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.744546 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247394d9-138c-4d51-8cee-94abc971faf2" path="/var/lib/kubelet/pods/247394d9-138c-4d51-8cee-94abc971faf2/volumes" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.745893 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcd7330-c0e2-4951-b4a1-250826e169b5" path="/var/lib/kubelet/pods/7dcd7330-c0e2-4951-b4a1-250826e169b5/volumes" Nov 24 12:39:44 crc kubenswrapper[4752]: I1124 12:39:44.791314 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-8169-account-create-76s8k"] Nov 24 12:39:44 crc kubenswrapper[4752]: W1124 12:39:44.804208 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5322bd_b92a_44ec_92ff_519b2d922f8e.slice/crio-e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d WatchSource:0}: Error finding container e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d: Status 404 returned error can't find the container with id e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.175440 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8169-account-create-76s8k" event={"ID":"7f5322bd-b92a-44ec-92ff-519b2d922f8e","Type":"ContainerStarted","Data":"38d2f9b6f84d73d99ba0e9c3d9cbcf112f0dd97838b9ab9f3dcbb916f7f442b4"} Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.175501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8169-account-create-76s8k" event={"ID":"7f5322bd-b92a-44ec-92ff-519b2d922f8e","Type":"ContainerStarted","Data":"e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d"} Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.198913 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-8169-account-create-76s8k" podStartSLOduration=2.198895609 podStartE2EDuration="2.198895609s" podCreationTimestamp="2025-11-24 12:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:39:45.195936474 +0000 UTC m=+5591.180756763" watchObservedRunningTime="2025-11-24 12:39:45.198895609 +0000 UTC m=+5591.183715898" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.473042 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.473106 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.692206 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-shb95" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.745858 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77rll\" (UniqueName: \"kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll\") pod \"d3847960-35ca-445f-8c43-06dfbef18148\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.747010 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts\") pod \"d3847960-35ca-445f-8c43-06dfbef18148\" (UID: \"d3847960-35ca-445f-8c43-06dfbef18148\") " Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.748305 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3847960-35ca-445f-8c43-06dfbef18148" (UID: "d3847960-35ca-445f-8c43-06dfbef18148"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.754182 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll" (OuterVolumeSpecName: "kube-api-access-77rll") pod "d3847960-35ca-445f-8c43-06dfbef18148" (UID: "d3847960-35ca-445f-8c43-06dfbef18148"). InnerVolumeSpecName "kube-api-access-77rll". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.849462 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77rll\" (UniqueName: \"kubernetes.io/projected/d3847960-35ca-445f-8c43-06dfbef18148-kube-api-access-77rll\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:45 crc kubenswrapper[4752]: I1124 12:39:45.849493 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3847960-35ca-445f-8c43-06dfbef18148-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:46 crc kubenswrapper[4752]: I1124 12:39:46.185401 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-shb95" event={"ID":"d3847960-35ca-445f-8c43-06dfbef18148","Type":"ContainerDied","Data":"80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2"} Nov 24 12:39:46 crc kubenswrapper[4752]: I1124 12:39:46.185448 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80647382aa94d46f53adffe91b99e452ade0f0039e4f142ae54bb39096bc6be2" Nov 24 12:39:46 crc kubenswrapper[4752]: I1124 12:39:46.185427 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-shb95" Nov 24 12:39:46 crc kubenswrapper[4752]: I1124 12:39:46.187505 4752 generic.go:334] "Generic (PLEG): container finished" podID="7f5322bd-b92a-44ec-92ff-519b2d922f8e" containerID="38d2f9b6f84d73d99ba0e9c3d9cbcf112f0dd97838b9ab9f3dcbb916f7f442b4" exitCode=0 Nov 24 12:39:46 crc kubenswrapper[4752]: I1124 12:39:46.187583 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8169-account-create-76s8k" event={"ID":"7f5322bd-b92a-44ec-92ff-519b2d922f8e","Type":"ContainerDied","Data":"38d2f9b6f84d73d99ba0e9c3d9cbcf112f0dd97838b9ab9f3dcbb916f7f442b4"} Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.651217 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.705433 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts\") pod \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.705488 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hcg8\" (UniqueName: \"kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8\") pod \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\" (UID: \"7f5322bd-b92a-44ec-92ff-519b2d922f8e\") " Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.706383 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f5322bd-b92a-44ec-92ff-519b2d922f8e" (UID: "7f5322bd-b92a-44ec-92ff-519b2d922f8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.711666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8" (OuterVolumeSpecName: "kube-api-access-9hcg8") pod "7f5322bd-b92a-44ec-92ff-519b2d922f8e" (UID: "7f5322bd-b92a-44ec-92ff-519b2d922f8e"). InnerVolumeSpecName "kube-api-access-9hcg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.808304 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5322bd-b92a-44ec-92ff-519b2d922f8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:47 crc kubenswrapper[4752]: I1124 12:39:47.808370 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hcg8\" (UniqueName: \"kubernetes.io/projected/7f5322bd-b92a-44ec-92ff-519b2d922f8e-kube-api-access-9hcg8\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:48 crc kubenswrapper[4752]: I1124 12:39:48.213066 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-8169-account-create-76s8k" event={"ID":"7f5322bd-b92a-44ec-92ff-519b2d922f8e","Type":"ContainerDied","Data":"e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d"} Nov 24 12:39:48 crc kubenswrapper[4752]: I1124 12:39:48.213558 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d46e56d364bff772c4a054f4bdc6384fd9f7ec350d99072eac42a009b1c90d" Nov 24 12:39:48 crc kubenswrapper[4752]: I1124 12:39:48.213150 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-8169-account-create-76s8k" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.839524 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-m5lqs"] Nov 24 12:39:49 crc kubenswrapper[4752]: E1124 12:39:49.840051 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5322bd-b92a-44ec-92ff-519b2d922f8e" containerName="mariadb-account-create" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.840066 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5322bd-b92a-44ec-92ff-519b2d922f8e" containerName="mariadb-account-create" Nov 24 12:39:49 crc kubenswrapper[4752]: E1124 12:39:49.840088 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3847960-35ca-445f-8c43-06dfbef18148" containerName="mariadb-database-create" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.840095 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3847960-35ca-445f-8c43-06dfbef18148" containerName="mariadb-database-create" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.840314 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3847960-35ca-445f-8c43-06dfbef18148" containerName="mariadb-database-create" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.840342 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5322bd-b92a-44ec-92ff-519b2d922f8e" containerName="mariadb-account-create" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.841075 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.866888 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-m5lqs"] Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.951938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:49 crc kubenswrapper[4752]: I1124 12:39:49.952295 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjqb\" (UniqueName: \"kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.054484 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjqb\" (UniqueName: \"kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.054601 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.055544 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.083952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjqb\" (UniqueName: \"kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb\") pod \"octavia-persistence-db-create-m5lqs\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.180130 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.685977 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-m5lqs"] Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.928268 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-b221-account-create-lm2zh"] Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.931238 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.933922 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.960922 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b221-account-create-lm2zh"] Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.975836 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7jq\" (UniqueName: \"kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:50 crc kubenswrapper[4752]: I1124 12:39:50.975955 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.028292 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vkk9s"] Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.036114 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vkk9s"] Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.077021 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7jq\" (UniqueName: \"kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.077293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.078074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.107194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7jq\" (UniqueName: \"kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq\") pod \"octavia-b221-account-create-lm2zh\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.239612 4752 generic.go:334] "Generic (PLEG): container finished" podID="c470417c-fbc7-4525-a5c5-e5d7a99cfa08" containerID="8d77796551eb0cbd11259ae6a87dbc713018f76403c385718038614f08eaf4bd" exitCode=0 Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.239663 4752 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/octavia-persistence-db-create-m5lqs" event={"ID":"c470417c-fbc7-4525-a5c5-e5d7a99cfa08","Type":"ContainerDied","Data":"8d77796551eb0cbd11259ae6a87dbc713018f76403c385718038614f08eaf4bd"} Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.239694 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-m5lqs" event={"ID":"c470417c-fbc7-4525-a5c5-e5d7a99cfa08","Type":"ContainerStarted","Data":"ba9497b2949066e741660500d4a58519da5fca773e01e854f848fe7da85f35e3"} Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.264116 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:51 crc kubenswrapper[4752]: I1124 12:39:51.917904 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b221-account-create-lm2zh"] Nov 24 12:39:51 crc kubenswrapper[4752]: W1124 12:39:51.932413 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a8ac779_7b0c_4aba_9a2c_8624a1a55516.slice/crio-561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b WatchSource:0}: Error finding container 561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b: Status 404 returned error can't find the container with id 561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.250778 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b221-account-create-lm2zh" event={"ID":"8a8ac779-7b0c-4aba-9a2c-8624a1a55516","Type":"ContainerStarted","Data":"9a713a4c2e75a1604df15ef04f5e0a88c28b15a38d1e4b44d9f490fc28d47551"} Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.250821 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b221-account-create-lm2zh" event={"ID":"8a8ac779-7b0c-4aba-9a2c-8624a1a55516","Type":"ContainerStarted","Data":"561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b"} Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.280845 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-b221-account-create-lm2zh" podStartSLOduration=2.280824366 podStartE2EDuration="2.280824366s" podCreationTimestamp="2025-11-24 12:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:39:52.266388652 +0000 UTC m=+5598.251208961" watchObservedRunningTime="2025-11-24 12:39:52.280824366 +0000 UTC m=+5598.265644665" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.607310 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.724342 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjqb\" (UniqueName: \"kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb\") pod \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.724604 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts\") pod \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\" (UID: \"c470417c-fbc7-4525-a5c5-e5d7a99cfa08\") " Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.725254 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c470417c-fbc7-4525-a5c5-e5d7a99cfa08" (UID: "c470417c-fbc7-4525-a5c5-e5d7a99cfa08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.730695 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb" (OuterVolumeSpecName: "kube-api-access-dtjqb") pod "c470417c-fbc7-4525-a5c5-e5d7a99cfa08" (UID: "c470417c-fbc7-4525-a5c5-e5d7a99cfa08"). InnerVolumeSpecName "kube-api-access-dtjqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.744629 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a44ba09-75b3-494c-88b1-065eb978f33a" path="/var/lib/kubelet/pods/5a44ba09-75b3-494c-88b1-065eb978f33a/volumes" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.826718 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:52 crc kubenswrapper[4752]: I1124 12:39:52.826766 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjqb\" (UniqueName: \"kubernetes.io/projected/c470417c-fbc7-4525-a5c5-e5d7a99cfa08-kube-api-access-dtjqb\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:53 crc kubenswrapper[4752]: I1124 12:39:53.262440 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a8ac779-7b0c-4aba-9a2c-8624a1a55516" containerID="9a713a4c2e75a1604df15ef04f5e0a88c28b15a38d1e4b44d9f490fc28d47551" exitCode=0 Nov 24 12:39:53 crc kubenswrapper[4752]: I1124 12:39:53.262517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b221-account-create-lm2zh" event={"ID":"8a8ac779-7b0c-4aba-9a2c-8624a1a55516","Type":"ContainerDied","Data":"9a713a4c2e75a1604df15ef04f5e0a88c28b15a38d1e4b44d9f490fc28d47551"} Nov 24 12:39:53 crc kubenswrapper[4752]: I1124 12:39:53.264286 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-m5lqs" event={"ID":"c470417c-fbc7-4525-a5c5-e5d7a99cfa08","Type":"ContainerDied","Data":"ba9497b2949066e741660500d4a58519da5fca773e01e854f848fe7da85f35e3"} Nov 24 12:39:53 crc kubenswrapper[4752]: I1124 12:39:53.264593 4752 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ba9497b2949066e741660500d4a58519da5fca773e01e854f848fe7da85f35e3" Nov 24 12:39:53 crc kubenswrapper[4752]: I1124 12:39:53.264397 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-m5lqs" Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.754452 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.882570 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts\") pod \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.882801 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx7jq\" (UniqueName: \"kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq\") pod \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\" (UID: \"8a8ac779-7b0c-4aba-9a2c-8624a1a55516\") " Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.883163 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a8ac779-7b0c-4aba-9a2c-8624a1a55516" (UID: "8a8ac779-7b0c-4aba-9a2c-8624a1a55516"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.888020 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq" (OuterVolumeSpecName: "kube-api-access-lx7jq") pod "8a8ac779-7b0c-4aba-9a2c-8624a1a55516" (UID: "8a8ac779-7b0c-4aba-9a2c-8624a1a55516"). InnerVolumeSpecName "kube-api-access-lx7jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.984883 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:54 crc kubenswrapper[4752]: I1124 12:39:54.984928 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx7jq\" (UniqueName: \"kubernetes.io/projected/8a8ac779-7b0c-4aba-9a2c-8624a1a55516-kube-api-access-lx7jq\") on node \"crc\" DevicePath \"\"" Nov 24 12:39:55 crc kubenswrapper[4752]: I1124 12:39:55.293736 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b221-account-create-lm2zh" event={"ID":"8a8ac779-7b0c-4aba-9a2c-8624a1a55516","Type":"ContainerDied","Data":"561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b"} Nov 24 12:39:55 crc kubenswrapper[4752]: I1124 12:39:55.293796 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="561cc35c7fd45ce6f77eb76a9abb0cd4d9387c3633930aa059a5cf0813abd35b" Nov 24 12:39:55 crc kubenswrapper[4752]: I1124 12:39:55.293850 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b221-account-create-lm2zh" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.975207 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-85bf5b5c78-688s2"] Nov 24 12:39:56 crc kubenswrapper[4752]: E1124 12:39:56.975882 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8ac779-7b0c-4aba-9a2c-8624a1a55516" containerName="mariadb-account-create" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.975895 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8ac779-7b0c-4aba-9a2c-8624a1a55516" containerName="mariadb-account-create" Nov 24 12:39:56 crc kubenswrapper[4752]: E1124 12:39:56.975923 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c470417c-fbc7-4525-a5c5-e5d7a99cfa08" containerName="mariadb-database-create" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.975928 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c470417c-fbc7-4525-a5c5-e5d7a99cfa08" containerName="mariadb-database-create" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.976099 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8ac779-7b0c-4aba-9a2c-8624a1a55516" containerName="mariadb-account-create" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.976114 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c470417c-fbc7-4525-a5c5-e5d7a99cfa08" containerName="mariadb-database-create" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.984119 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.987834 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-bvggb" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.988023 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.988459 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Nov 24 12:39:56 crc kubenswrapper[4752]: I1124 12:39:56.989300 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-85bf5b5c78-688s2"] Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.020858 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-octavia-run\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.020955 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-scripts\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.020980 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-config-data-merged\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 
12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.021020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-config-data\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.021051 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-combined-ca-bundle\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.122339 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-config-data\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.122687 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-combined-ca-bundle\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.122789 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-octavia-run\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.122863 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-scripts\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.122885 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-config-data-merged\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.123426 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-config-data-merged\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.123420 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/497355f0-5071-444a-96df-145e4220d015-octavia-run\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 
12:39:57.128882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-scripts\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.132074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-combined-ca-bundle\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.132334 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497355f0-5071-444a-96df-145e4220d015-config-data\") pod \"octavia-api-85bf5b5c78-688s2\" (UID: \"497355f0-5071-444a-96df-145e4220d015\") " pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.309050 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:39:57 crc kubenswrapper[4752]: I1124 12:39:57.802098 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-85bf5b5c78-688s2"] Nov 24 12:39:57 crc kubenswrapper[4752]: W1124 12:39:57.806704 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497355f0_5071_444a_96df_145e4220d015.slice/crio-ccf52194301bf4ba755b2c34a5eb2e681ed45de26a6ada5b7556fd7abc198f03 WatchSource:0}: Error finding container ccf52194301bf4ba755b2c34a5eb2e681ed45de26a6ada5b7556fd7abc198f03: Status 404 returned error can't find the container with id ccf52194301bf4ba755b2c34a5eb2e681ed45de26a6ada5b7556fd7abc198f03 Nov 24 12:39:58 crc kubenswrapper[4752]: I1124 12:39:58.330927 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-85bf5b5c78-688s2" event={"ID":"497355f0-5071-444a-96df-145e4220d015","Type":"ContainerStarted","Data":"ccf52194301bf4ba755b2c34a5eb2e681ed45de26a6ada5b7556fd7abc198f03"} Nov 24 12:40:04 crc kubenswrapper[4752]: I1124 12:40:04.053695 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b7ffs"] Nov 24 12:40:04 crc kubenswrapper[4752]: I1124 12:40:04.063630 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b7ffs"] Nov 24 12:40:04 crc kubenswrapper[4752]: I1124 12:40:04.740148 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8abdcde-a74d-4d46-9827-720f5202a317" path="/var/lib/kubelet/pods/c8abdcde-a74d-4d46-9827-720f5202a317/volumes" Nov 24 12:40:07 crc kubenswrapper[4752]: I1124 12:40:07.426970 4752 generic.go:334] "Generic (PLEG): container finished" podID="497355f0-5071-444a-96df-145e4220d015" containerID="fe7ef4e8469f19a367e397e1c5a36c7dd66a32f9966211f3137e74848fd83426" exitCode=0 Nov 24 12:40:07 crc kubenswrapper[4752]: I1124 12:40:07.427084 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-85bf5b5c78-688s2" event={"ID":"497355f0-5071-444a-96df-145e4220d015","Type":"ContainerDied","Data":"fe7ef4e8469f19a367e397e1c5a36c7dd66a32f9966211f3137e74848fd83426"}
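
The fe7ef4e8... container exiting with code 0 is octavia-api's init step completing; the next two entries show the service containers starting in its place. Because every lifecycle transition surfaces as a "SyncLoop (PLEG): event for pod" record, a per-pod container timeline can be reconstructed mechanically. A sketch under the same stdin assumption, matched to the event={"ID":...,"Type":...,"Data":...} form shown in these lines:

```python
import re
import sys
from collections import defaultdict

# Timestamp prefix plus the PLEG event payload, as captured above.
event_re = re.compile(
    r'^(\w{3} \d+ [\d:]{8}).*"SyncLoop \(PLEG\): event for pod" '
    r'pod="([^"]+)" event=\{"ID":"[^"]*","Type":"([^"]*)","Data":"([^"]*)"\}'
)

timeline = defaultdict(list)
for line in sys.stdin:
    match = event_re.search(line)
    if match:
        stamp, pod, kind, ref = match.groups()
        timeline[pod].append((stamp, kind, ref[:12]))  # shorten the 64-char ID

for pod, events in timeline.items():
    print(pod)
    for stamp, kind, ref in events:
        print(f"  {stamp}  {kind:<16} {ref}")
```
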
pod="openstack/octavia-api-85bf5b5c78-688s2" event={"ID":"497355f0-5071-444a-96df-145e4220d015","Type":"ContainerStarted","Data":"85198b62a39829859e09c2884ddda3c67bb9609479478dcba4c7a2f8a4087155"} Nov 24 12:40:08 crc kubenswrapper[4752]: I1124 12:40:08.439522 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-85bf5b5c78-688s2" event={"ID":"497355f0-5071-444a-96df-145e4220d015","Type":"ContainerStarted","Data":"e79ab5a446a54ead739b5b237ea3fa94cd98d86c6df472088bf81454e09c57b7"} Nov 24 12:40:08 crc kubenswrapper[4752]: I1124 12:40:08.440107 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:40:08 crc kubenswrapper[4752]: I1124 12:40:08.440167 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:40:08 crc kubenswrapper[4752]: I1124 12:40:08.461631 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-85bf5b5c78-688s2" podStartSLOduration=4.024385683 podStartE2EDuration="12.46161226s" podCreationTimestamp="2025-11-24 12:39:56 +0000 UTC" firstStartedPulling="2025-11-24 12:39:57.809261183 +0000 UTC m=+5603.794081462" lastFinishedPulling="2025-11-24 12:40:06.24648774 +0000 UTC m=+5612.231308039" observedRunningTime="2025-11-24 12:40:08.460113217 +0000 UTC m=+5614.444933506" watchObservedRunningTime="2025-11-24 12:40:08.46161226 +0000 UTC m=+5614.446432549" Nov 24 12:40:15 crc kubenswrapper[4752]: I1124 12:40:15.469432 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:40:15 crc kubenswrapper[4752]: I1124 12:40:15.470367 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:40:15 crc kubenswrapper[4752]: I1124 12:40:15.941354 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f95n8" podUID="0be41fba-f8d1-426b-bedc-9d318f73bbbd" containerName="ovn-controller" probeResult="failure" output=< Nov 24 12:40:15 crc kubenswrapper[4752]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 24 12:40:15 crc kubenswrapper[4752]: > Nov 24 12:40:15 crc kubenswrapper[4752]: I1124 12:40:15.969136 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:40:15 crc kubenswrapper[4752]: I1124 12:40:15.984258 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z24v5" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.103218 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f95n8-config-d6mt8"] Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.104326 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.106416 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.119666 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f95n8-config-d6mt8"] Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.244892 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.245033 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.245065 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.245125 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zh4\" (UniqueName: \"kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.245153 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.245176 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346370 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zh4\" (UniqueName: \"kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346457 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346501 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.346613 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.347027 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.347116 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.347148 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.347512 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.349618 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.376786 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zh4\" (UniqueName: \"kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4\") pod \"ovn-controller-f95n8-config-d6mt8\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.426181 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.692183 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.777346 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-85bf5b5c78-688s2" Nov 24 12:40:16 crc kubenswrapper[4752]: I1124 12:40:16.936195 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f95n8-config-d6mt8"] Nov 24 12:40:17 crc kubenswrapper[4752]: I1124 12:40:17.599331 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8-config-d6mt8" event={"ID":"56d37b02-6163-4e93-8a79-1418dd176661","Type":"ContainerStarted","Data":"a4b317cccf9b069d669d79631708f878dfceb62d2f2df166276b3ddae208c6c9"} Nov 24 12:40:17 crc kubenswrapper[4752]: I1124 12:40:17.599387 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8-config-d6mt8" event={"ID":"56d37b02-6163-4e93-8a79-1418dd176661","Type":"ContainerStarted","Data":"b155d81e3a5c3b30f7c194c7eb6cc0953f68f67fa7c82cfb09b1bcbe138b613a"} Nov 24 12:40:17 crc kubenswrapper[4752]: I1124 12:40:17.620257 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-f95n8-config-d6mt8" podStartSLOduration=1.620239922 podStartE2EDuration="1.620239922s" podCreationTimestamp="2025-11-24 12:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:40:17.614051074 +0000 UTC m=+5623.598871353" watchObservedRunningTime="2025-11-24 12:40:17.620239922 +0000 UTC m=+5623.605060211" Nov 24 12:40:18 crc kubenswrapper[4752]: I1124 12:40:18.611400 4752 generic.go:334] "Generic (PLEG): container finished" podID="56d37b02-6163-4e93-8a79-1418dd176661" containerID="a4b317cccf9b069d669d79631708f878dfceb62d2f2df166276b3ddae208c6c9" exitCode=0 Nov 24 12:40:18 crc kubenswrapper[4752]: I1124 12:40:18.611482 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8-config-d6mt8" event={"ID":"56d37b02-6163-4e93-8a79-1418dd176661","Type":"ContainerDied","Data":"a4b317cccf9b069d669d79631708f878dfceb62d2f2df166276b3ddae208c6c9"} Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.024252 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150336 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150457 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150532 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150552 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150613 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150801 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zh4\" (UniqueName: \"kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.150849 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run\") pod \"56d37b02-6163-4e93-8a79-1418dd176661\" (UID: \"56d37b02-6163-4e93-8a79-1418dd176661\") " Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151056 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run" (OuterVolumeSpecName: "var-run") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151627 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151729 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts" (OuterVolumeSpecName: "scripts") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151832 4752 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151875 4752 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.151902 4752 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d37b02-6163-4e93-8a79-1418dd176661-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.160302 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4" (OuterVolumeSpecName: "kube-api-access-g8zh4") pod "56d37b02-6163-4e93-8a79-1418dd176661" (UID: "56d37b02-6163-4e93-8a79-1418dd176661"). InnerVolumeSpecName "kube-api-access-g8zh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.254274 4752 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.254315 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56d37b02-6163-4e93-8a79-1418dd176661-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.254329 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zh4\" (UniqueName: \"kubernetes.io/projected/56d37b02-6163-4e93-8a79-1418dd176661-kube-api-access-g8zh4\") on node \"crc\" DevicePath \"\"" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.637047 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f95n8-config-d6mt8" event={"ID":"56d37b02-6163-4e93-8a79-1418dd176661","Type":"ContainerDied","Data":"b155d81e3a5c3b30f7c194c7eb6cc0953f68f67fa7c82cfb09b1bcbe138b613a"} Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.637109 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b155d81e3a5c3b30f7c194c7eb6cc0953f68f67fa7c82cfb09b1bcbe138b613a" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.637146 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f95n8-config-d6mt8" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.713792 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f95n8-config-d6mt8"] Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.738204 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-f95n8-config-d6mt8"] Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.766814 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d37b02-6163-4e93-8a79-1418dd176661" path="/var/lib/kubelet/pods/56d37b02-6163-4e93-8a79-1418dd176661/volumes" Nov 24 12:40:20 crc kubenswrapper[4752]: I1124 12:40:20.954892 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-f95n8" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.453312 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-795nt"] Nov 24 12:40:22 crc kubenswrapper[4752]: E1124 12:40:22.454353 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d37b02-6163-4e93-8a79-1418dd176661" containerName="ovn-config" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.454377 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d37b02-6163-4e93-8a79-1418dd176661" containerName="ovn-config" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.454696 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d37b02-6163-4e93-8a79-1418dd176661" containerName="ovn-config" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.456436 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.459046 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.459292 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.459499 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.464360 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-795nt"] Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.600930 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data-merged\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.601088 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.601228 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-scripts\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.601370 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-hm-ports\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.703621 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data-merged\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.703716 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.703901 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-scripts\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.704011 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: 
\"kubernetes.io/configmap/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-hm-ports\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.704082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data-merged\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.705490 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-hm-ports\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.712487 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-config-data\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.712504 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2a0fc67-fcee-4ec4-ae55-de5a44214b27-scripts\") pod \"octavia-rsyslog-795nt\" (UID: \"e2a0fc67-fcee-4ec4-ae55-de5a44214b27\") " pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:22 crc kubenswrapper[4752]: I1124 12:40:22.792088 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-795nt" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.022703 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"] Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.025425 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.033582 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.037638 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"] Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.115715 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.115874 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.218150 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.218686 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.219167 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.225604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config\") pod \"octavia-image-upload-59f8cff499-cp9rm\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") " pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.358442 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.435539 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-795nt"] Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.684548 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-795nt" event={"ID":"e2a0fc67-fcee-4ec4-ae55-de5a44214b27","Type":"ContainerStarted","Data":"5570f058b18751ff5886f3fde8aed7e399478c60117ed6e3a7c86674ae069d9d"} Nov 24 12:40:23 crc kubenswrapper[4752]: I1124 12:40:23.820377 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"] Nov 24 12:40:23 crc kubenswrapper[4752]: W1124 12:40:23.824885 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbbdcce4_d55e_4504_83df_236789ebed4b.slice/crio-a9e4a8fa5292424ca6ba13fef92bb4e342f0c69d8c193996ce37f67dbc78b07a WatchSource:0}: Error finding container a9e4a8fa5292424ca6ba13fef92bb4e342f0c69d8c193996ce37f67dbc78b07a: Status 404 returned error can't find the container with id a9e4a8fa5292424ca6ba13fef92bb4e342f0c69d8c193996ce37f67dbc78b07a Nov 24 12:40:24 crc kubenswrapper[4752]: I1124 12:40:24.698484 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerStarted","Data":"a9e4a8fa5292424ca6ba13fef92bb4e342f0c69d8c193996ce37f67dbc78b07a"} Nov 24 12:40:25 crc kubenswrapper[4752]: I1124 12:40:25.710962 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-795nt" event={"ID":"e2a0fc67-fcee-4ec4-ae55-de5a44214b27","Type":"ContainerStarted","Data":"0a45c9dd33af2b3b80ce37b51eb98a6397bf6ad15d123daf26c92f19fb9b3858"} Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.883065 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-d6fr7"] Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.885882 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.888461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.888631 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.888757 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Nov 24 12:40:27 crc kubenswrapper[4752]: I1124 12:40:27.900588 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-d6fr7"] Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024128 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-scripts\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-combined-ca-bundle\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024279 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/97c0db40-afff-4067-9fee-06657cfbf155-hm-ports\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97c0db40-afff-4067-9fee-06657cfbf155-config-data-merged\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024387 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-config-data\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.024508 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-amphora-certs\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.126431 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/97c0db40-afff-4067-9fee-06657cfbf155-hm-ports\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc 
kubenswrapper[4752]: I1124 12:40:28.126495 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97c0db40-afff-4067-9fee-06657cfbf155-config-data-merged\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.126518 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-config-data\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.126546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-amphora-certs\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.126599 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-scripts\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.126616 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-combined-ca-bundle\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.128367 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/97c0db40-afff-4067-9fee-06657cfbf155-hm-ports\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.128447 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/97c0db40-afff-4067-9fee-06657cfbf155-config-data-merged\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.139062 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-config-data\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.139465 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-combined-ca-bundle\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7" Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.140161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-scripts\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7"
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.145561 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/97c0db40-afff-4067-9fee-06657cfbf155-amphora-certs\") pod \"octavia-healthmanager-d6fr7\" (UID: \"97c0db40-afff-4067-9fee-06657cfbf155\") " pod="openstack/octavia-healthmanager-d6fr7"
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.258024 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-d6fr7"
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.741048 4752 generic.go:334] "Generic (PLEG): container finished" podID="e2a0fc67-fcee-4ec4-ae55-de5a44214b27" containerID="0a45c9dd33af2b3b80ce37b51eb98a6397bf6ad15d123daf26c92f19fb9b3858" exitCode=0
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.741327 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-795nt" event={"ID":"e2a0fc67-fcee-4ec4-ae55-de5a44214b27","Type":"ContainerDied","Data":"0a45c9dd33af2b3b80ce37b51eb98a6397bf6ad15d123daf26c92f19fb9b3858"}
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.932892 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-d6fr7"]
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.955420 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-gjzpt"]
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.957051 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.962640 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts"
Nov 24 12:40:28 crc kubenswrapper[4752]: I1124 12:40:28.981628 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-gjzpt"]
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.052400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.052463 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.052512 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.052632 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.154075 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.154148 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.154188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.154229 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.155256 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.167018 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.167326 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.167387 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data\") pod \"octavia-db-sync-gjzpt\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") " pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.352675 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.751949 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d6fr7" event={"ID":"97c0db40-afff-4067-9fee-06657cfbf155","Type":"ContainerStarted","Data":"16e7e2afa122412d553afcfc9ff1cc4ee577f872f1a5e218845287a35e73aba9"}
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.752002 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d6fr7" event={"ID":"97c0db40-afff-4067-9fee-06657cfbf155","Type":"ContainerStarted","Data":"f56d0e4045b6fa974597eb645f17f1490c000bd8c6df8a30dc6a975518622584"}
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.988467 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-zvcnm"]
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.990909 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.996141 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Nov 24 12:40:29 crc kubenswrapper[4752]: I1124 12:40:29.996301 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.014609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-zvcnm"]
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.074985 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.075045 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4fb19eb4-b714-4b63-a9b3-9e2427994194-hm-ports\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.075110 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-combined-ca-bundle\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.075188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data-merged\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.075226 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-scripts\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.075253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-amphora-certs\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.176650 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data-merged\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.176734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-scripts\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.176788 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-amphora-certs\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.176968 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.176992 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4fb19eb4-b714-4b63-a9b3-9e2427994194-hm-ports\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.177063 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-combined-ca-bundle\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.178242 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data-merged\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.179724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4fb19eb4-b714-4b63-a9b3-9e2427994194-hm-ports\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.183580 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-combined-ca-bundle\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.184821 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-scripts\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.185837 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-config-data\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.186827 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4fb19eb4-b714-4b63-a9b3-9e2427994194-amphora-certs\") pod \"octavia-housekeeping-zvcnm\" (UID: \"4fb19eb4-b714-4b63-a9b3-9e2427994194\") " pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:30 crc kubenswrapper[4752]: I1124 12:40:30.332001 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:31 crc kubenswrapper[4752]: I1124 12:40:31.772598 4752 generic.go:334] "Generic (PLEG): container finished" podID="97c0db40-afff-4067-9fee-06657cfbf155" containerID="16e7e2afa122412d553afcfc9ff1cc4ee577f872f1a5e218845287a35e73aba9" exitCode=0
Nov 24 12:40:31 crc kubenswrapper[4752]: I1124 12:40:31.772790 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d6fr7" event={"ID":"97c0db40-afff-4067-9fee-06657cfbf155","Type":"ContainerDied","Data":"16e7e2afa122412d553afcfc9ff1cc4ee577f872f1a5e218845287a35e73aba9"}
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.361134 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-fztv9"]
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.363862 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.367327 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.367454 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.371181 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-fztv9"]
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.438920 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-combined-ca-bundle\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.438999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-scripts\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.439205 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data-merged\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.439250 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec1f33db-dd01-4521-b4de-c2d6cecc5695-hm-ports\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.439416 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.439629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-amphora-certs\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541498 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data-merged\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec1f33db-dd01-4521-b4de-c2d6cecc5695-hm-ports\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541617 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-amphora-certs\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541730 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-combined-ca-bundle\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.541770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-scripts\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.542134 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data-merged\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.542655 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ec1f33db-dd01-4521-b4de-c2d6cecc5695-hm-ports\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.547667 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-amphora-certs\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.547776 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-config-data\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.547871 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-scripts\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.565656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f33db-dd01-4521-b4de-c2d6cecc5695-combined-ca-bundle\") pod \"octavia-worker-fztv9\" (UID: \"ec1f33db-dd01-4521-b4de-c2d6cecc5695\") " pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:33 crc kubenswrapper[4752]: I1124 12:40:33.689176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.622971 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-gjzpt"]
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.754374 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-fztv9"]
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.822106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerStarted","Data":"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"}
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.824803 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-fztv9" event={"ID":"ec1f33db-dd01-4521-b4de-c2d6cecc5695","Type":"ContainerStarted","Data":"df565b24bf5a532eb658d698f2d53107cc072d3b0b7a96c8f871b79262260d07"}
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.828126 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-795nt" event={"ID":"e2a0fc67-fcee-4ec4-ae55-de5a44214b27","Type":"ContainerStarted","Data":"07d4eb5f7f42b0eb2a48ec38429e5eb95c600da8824bf6cdafa99e7545b741d2"}
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.828578 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-795nt"
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.842260 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-d6fr7" event={"ID":"97c0db40-afff-4067-9fee-06657cfbf155","Type":"ContainerStarted","Data":"f41894ab0b2d38d6385b07215b7af9a300bd7a180e9a51d12ede77a710050feb"}
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.842492 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-d6fr7"
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.845508 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gjzpt" event={"ID":"dc345722-cbf8-460e-bc35-86df16ab5f27","Type":"ContainerStarted","Data":"f2bb6546a5231eb916a1f71c976c404f87e5875a776609c266cd096aff65f2aa"}
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.863505 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-795nt" podStartSLOduration=2.276066094 podStartE2EDuration="12.863483386s" podCreationTimestamp="2025-11-24 12:40:22 +0000 UTC" firstStartedPulling="2025-11-24 12:40:23.446651706 +0000 UTC m=+5629.431472005" lastFinishedPulling="2025-11-24 12:40:34.034069018 +0000 UTC m=+5640.018889297" observedRunningTime="2025-11-24 12:40:34.854248021 +0000 UTC m=+5640.839068340" watchObservedRunningTime="2025-11-24 12:40:34.863483386 +0000 UTC m=+5640.848303675"
Nov 24 12:40:34 crc kubenswrapper[4752]: I1124 12:40:34.876972 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-d6fr7" podStartSLOduration=7.876952683 podStartE2EDuration="7.876952683s" podCreationTimestamp="2025-11-24 12:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:40:34.873279018 +0000 UTC m=+5640.858099317" watchObservedRunningTime="2025-11-24 12:40:34.876952683 +0000 UTC m=+5640.861772982"
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.414796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-zvcnm"]
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.861877 4752 generic.go:334] "Generic (PLEG): container finished" podID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerID="ea501866f7dd91a7fecc400f2b7b86685e522964dbe74fbd25586f3fa286ca0d" exitCode=0
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.861968 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gjzpt" event={"ID":"dc345722-cbf8-460e-bc35-86df16ab5f27","Type":"ContainerDied","Data":"ea501866f7dd91a7fecc400f2b7b86685e522964dbe74fbd25586f3fa286ca0d"}
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.863591 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-zvcnm" event={"ID":"4fb19eb4-b714-4b63-a9b3-9e2427994194","Type":"ContainerStarted","Data":"8f0639a08474d4588670798f86cf328caca2824c51f7954486bca184f5b5c902"}
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.867355 4752 generic.go:334] "Generic (PLEG): container finished" podID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerID="7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb" exitCode=0
Nov 24 12:40:35 crc kubenswrapper[4752]: I1124 12:40:35.868082 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerDied","Data":"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"}
Nov 24 12:40:36 crc kubenswrapper[4752]: I1124 12:40:36.878911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gjzpt" event={"ID":"dc345722-cbf8-460e-bc35-86df16ab5f27","Type":"ContainerStarted","Data":"3a07ae90a33a655478501b1e0bd4ed8027fe71ee1680e3cb4425379acc09b6de"}
Nov 24 12:40:36 crc kubenswrapper[4752]: I1124 12:40:36.895048 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-gjzpt" podStartSLOduration=8.895026715 podStartE2EDuration="8.895026715s" podCreationTimestamp="2025-11-24 12:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:40:36.892207234 +0000 UTC m=+5642.877027533" watchObservedRunningTime="2025-11-24 12:40:36.895026715 +0000 UTC m=+5642.879847004"
Nov 24 12:40:38 crc kubenswrapper[4752]: I1124 12:40:38.911441 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-fztv9" event={"ID":"ec1f33db-dd01-4521-b4de-c2d6cecc5695","Type":"ContainerStarted","Data":"721a2ef1a4f34f8146f7bf984e09dad100912d49508190aae08d670bc016f2e1"}
Nov 24 12:40:38 crc kubenswrapper[4752]: I1124 12:40:38.915335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-zvcnm" event={"ID":"4fb19eb4-b714-4b63-a9b3-9e2427994194","Type":"ContainerStarted","Data":"39966f8a54f4fba186d44b34c4a28cb1061db0329a3ebcd37d2bc6b132e640aa"}
Nov 24 12:40:38 crc kubenswrapper[4752]: I1124 12:40:38.918499 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerStarted","Data":"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"}
Nov 24 12:40:38 crc kubenswrapper[4752]: I1124 12:40:38.976145 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" podStartSLOduration=2.523663875 podStartE2EDuration="16.976125867s" podCreationTimestamp="2025-11-24 12:40:22 +0000 UTC" firstStartedPulling="2025-11-24 12:40:23.826865395 +0000 UTC m=+5629.811685684" lastFinishedPulling="2025-11-24 12:40:38.279327397 +0000 UTC m=+5644.264147676" observedRunningTime="2025-11-24 12:40:38.972869353 +0000 UTC m=+5644.957689682" watchObservedRunningTime="2025-11-24 12:40:38.976125867 +0000 UTC m=+5644.960946166"
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.932524 4752 generic.go:334] "Generic (PLEG): container finished" podID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerID="3a07ae90a33a655478501b1e0bd4ed8027fe71ee1680e3cb4425379acc09b6de" exitCode=0
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.932645 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gjzpt" event={"ID":"dc345722-cbf8-460e-bc35-86df16ab5f27","Type":"ContainerDied","Data":"3a07ae90a33a655478501b1e0bd4ed8027fe71ee1680e3cb4425379acc09b6de"}
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.934946 4752 generic.go:334] "Generic (PLEG): container finished" podID="4fb19eb4-b714-4b63-a9b3-9e2427994194" containerID="39966f8a54f4fba186d44b34c4a28cb1061db0329a3ebcd37d2bc6b132e640aa" exitCode=0
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.935035 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-zvcnm" event={"ID":"4fb19eb4-b714-4b63-a9b3-9e2427994194","Type":"ContainerDied","Data":"39966f8a54f4fba186d44b34c4a28cb1061db0329a3ebcd37d2bc6b132e640aa"}
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.936925 4752 generic.go:334] "Generic (PLEG): container finished" podID="ec1f33db-dd01-4521-b4de-c2d6cecc5695" containerID="721a2ef1a4f34f8146f7bf984e09dad100912d49508190aae08d670bc016f2e1" exitCode=0
Nov 24 12:40:39 crc kubenswrapper[4752]: I1124 12:40:39.937111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-fztv9" event={"ID":"ec1f33db-dd01-4521-b4de-c2d6cecc5695","Type":"ContainerDied","Data":"721a2ef1a4f34f8146f7bf984e09dad100912d49508190aae08d670bc016f2e1"}
Nov 24 12:40:40 crc kubenswrapper[4752]: I1124 12:40:40.963108 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-zvcnm" event={"ID":"4fb19eb4-b714-4b63-a9b3-9e2427994194","Type":"ContainerStarted","Data":"6f64788d1a9712dccc24de4c1aba542ec6b3116448be296f905a27f5d7d3da7d"}
Nov 24 12:40:40 crc kubenswrapper[4752]: I1124 12:40:40.963526 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:40 crc kubenswrapper[4752]: I1124 12:40:40.971576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-fztv9" event={"ID":"ec1f33db-dd01-4521-b4de-c2d6cecc5695","Type":"ContainerStarted","Data":"5facdc5361e48e7375fa6e9ae9262d0a62b6363709756076bb448e958fc68db9"}
Nov 24 12:40:40 crc kubenswrapper[4752]: I1124 12:40:40.971622 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.005251 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-zvcnm" podStartSLOduration=9.168139994 podStartE2EDuration="12.005229165s" podCreationTimestamp="2025-11-24 12:40:29 +0000 UTC" firstStartedPulling="2025-11-24 12:40:35.426444663 +0000 UTC m=+5641.411264952" lastFinishedPulling="2025-11-24 12:40:38.263533794 +0000 UTC m=+5644.248354123" observedRunningTime="2025-11-24 12:40:40.986168108 +0000 UTC m=+5646.970988417" watchObservedRunningTime="2025-11-24 12:40:41.005229165 +0000 UTC m=+5646.990049454"
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.013565 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-fztv9" podStartSLOduration=4.510635553 podStartE2EDuration="8.013538664s" podCreationTimestamp="2025-11-24 12:40:33 +0000 UTC" firstStartedPulling="2025-11-24 12:40:34.76089652 +0000 UTC m=+5640.745716819" lastFinishedPulling="2025-11-24 12:40:38.263799611 +0000 UTC m=+5644.248619930" observedRunningTime="2025-11-24 12:40:41.00468894 +0000 UTC m=+5646.989509229" watchObservedRunningTime="2025-11-24 12:40:41.013538664 +0000 UTC m=+5646.998358963"
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.380222 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.486528 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data\") pod \"dc345722-cbf8-460e-bc35-86df16ab5f27\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") "
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.486785 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts\") pod \"dc345722-cbf8-460e-bc35-86df16ab5f27\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") "
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.486852 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged\") pod \"dc345722-cbf8-460e-bc35-86df16ab5f27\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") "
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.486951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle\") pod \"dc345722-cbf8-460e-bc35-86df16ab5f27\" (UID: \"dc345722-cbf8-460e-bc35-86df16ab5f27\") "
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.492121 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data" (OuterVolumeSpecName: "config-data") pod "dc345722-cbf8-460e-bc35-86df16ab5f27" (UID: "dc345722-cbf8-460e-bc35-86df16ab5f27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.493915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts" (OuterVolumeSpecName: "scripts") pod "dc345722-cbf8-460e-bc35-86df16ab5f27" (UID: "dc345722-cbf8-460e-bc35-86df16ab5f27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.517338 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc345722-cbf8-460e-bc35-86df16ab5f27" (UID: "dc345722-cbf8-460e-bc35-86df16ab5f27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.518110 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "dc345722-cbf8-460e-bc35-86df16ab5f27" (UID: "dc345722-cbf8-460e-bc35-86df16ab5f27"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.589680 4752 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data-merged\") on node \"crc\" DevicePath \"\""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.589742 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.589799 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.589816 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc345722-cbf8-460e-bc35-86df16ab5f27-scripts\") on node \"crc\" DevicePath \"\""
Nov 24 12:40:41 crc kubenswrapper[4752]: I1124 12:40:41.998769 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-gjzpt"
Nov 24 12:40:42 crc kubenswrapper[4752]: I1124 12:40:42.003690 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-gjzpt" event={"ID":"dc345722-cbf8-460e-bc35-86df16ab5f27","Type":"ContainerDied","Data":"f2bb6546a5231eb916a1f71c976c404f87e5875a776609c266cd096aff65f2aa"}
Nov 24 12:40:42 crc kubenswrapper[4752]: I1124 12:40:42.003729 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bb6546a5231eb916a1f71c976c404f87e5875a776609c266cd096aff65f2aa"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.159353 4752 scope.go:117] "RemoveContainer" containerID="97904e03a9e69f15199ad08f14ced0b5d650e7e6aebb105731ce3ea0a5b67737"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.211510 4752 scope.go:117] "RemoveContainer" containerID="9e52b489bccd337633be634c1b697052e7c0f23ed4f699b300767ccfddfc6d97"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.270591 4752 scope.go:117] "RemoveContainer" containerID="3fcd22e27115e031aa9cfbdaaffd5d8571a4f719060255e300db77cb4053e43e"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.306402 4752 scope.go:117] "RemoveContainer" containerID="1a088eb537ab104cac4db7a20cb40f46d9026ba7bb6913a90cacd74d803f2798"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.306980 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-d6fr7"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.345124 4752 scope.go:117] "RemoveContainer" containerID="e1af3090dba1cf6902ff93cb18c5dbc472f819c44f69f8bfdd6048947c4fd112"
Nov 24 12:40:43 crc kubenswrapper[4752]: I1124 12:40:43.379375 4752 scope.go:117] "RemoveContainer" containerID="e9745bd8b44f469720f5a096053a66d49d8217a2ccf49fb67159efd4dda2a74e"
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.372072 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-zvcnm"
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.468781 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.468857 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.468924 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.470005 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 12:40:45 crc kubenswrapper[4752]: I1124 12:40:45.470112 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" gracePeriod=600
Nov 24 12:40:45 crc kubenswrapper[4752]: E1124 12:40:45.597669 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:40:46 crc kubenswrapper[4752]: I1124 12:40:46.051002 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" exitCode=0
Nov 24 12:40:46 crc kubenswrapper[4752]: I1124 12:40:46.051039 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"}
Nov 24 12:40:46 crc kubenswrapper[4752]: I1124 12:40:46.051346 4752 scope.go:117] "RemoveContainer" containerID="9584ea900c82ce825ee45effc8c387923a07ba88ab659856bc5c308458118a8b"
Nov 24 12:40:46 crc kubenswrapper[4752]: I1124 12:40:46.051986 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:40:46 crc kubenswrapper[4752]: E1124 12:40:46.052254 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:40:48 crc kubenswrapper[4752]: I1124 12:40:48.723945 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-fztv9"
Nov 24 12:40:52 crc kubenswrapper[4752]: I1124 12:40:52.841220 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-795nt"
Nov 24 12:40:58 crc kubenswrapper[4752]: I1124 12:40:58.729329 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:40:58 crc kubenswrapper[4752]: E1124 12:40:58.730700 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:41:08 crc kubenswrapper[4752]: I1124 12:41:08.610242 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"]
Nov 24 12:41:08 crc kubenswrapper[4752]: I1124 12:41:08.611318 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="octavia-amphora-httpd" containerID="cri-o://3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8" gracePeriod=30
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.223195 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cp9rm"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.274316 4752 generic.go:334] "Generic (PLEG): container finished" podID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerID="3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8" exitCode=0
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.274358 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerDied","Data":"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"}
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.274386 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-cp9rm" event={"ID":"cbbdcce4-d55e-4504-83df-236789ebed4b","Type":"ContainerDied","Data":"a9e4a8fa5292424ca6ba13fef92bb4e342f0c69d8c193996ce37f67dbc78b07a"}
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.274407 4752 scope.go:117] "RemoveContainer" containerID="3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.274415 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-cp9rm"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.291759 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image\") pod \"cbbdcce4-d55e-4504-83df-236789ebed4b\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") "
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.291856 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config\") pod \"cbbdcce4-d55e-4504-83df-236789ebed4b\" (UID: \"cbbdcce4-d55e-4504-83df-236789ebed4b\") "
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.314202 4752 scope.go:117] "RemoveContainer" containerID="7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.340120 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cbbdcce4-d55e-4504-83df-236789ebed4b" (UID: "cbbdcce4-d55e-4504-83df-236789ebed4b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.349021 4752 scope.go:117] "RemoveContainer" containerID="3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"
Nov 24 12:41:09 crc kubenswrapper[4752]: E1124 12:41:09.349364 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8\": container with ID starting with 3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8 not found: ID does not exist" containerID="3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.349397 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8"} err="failed to get container status \"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8\": rpc error: code = NotFound desc = could not find container \"3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8\": container with ID starting with 3951e138e9b41b2d66f47ddaa321916a25e36217a469fdf8b51884b9d52552c8 not found: ID does not exist"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.349422 4752 scope.go:117] "RemoveContainer" containerID="7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"
Nov 24 12:41:09 crc kubenswrapper[4752]: E1124 12:41:09.349601 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb\": container with ID starting with 7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb not found: ID does not exist" containerID="7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.349622 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb"} err="failed to get container status \"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb\": rpc error: code = NotFound desc = could not find container \"7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb\": container with ID starting with 7b0bcc9ed59adbc2a52af1dc4b864172d5155149bd05e98e3da1e9ea3332e7fb not found: ID does not exist"
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.395004 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cbbdcce4-d55e-4504-83df-236789ebed4b-httpd-config\") on node \"crc\" DevicePath \"\""
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.396506 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "cbbdcce4-d55e-4504-83df-236789ebed4b" (UID: "cbbdcce4-d55e-4504-83df-236789ebed4b"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.496572 4752 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/cbbdcce4-d55e-4504-83df-236789ebed4b-amphora-image\") on node \"crc\" DevicePath \"\""
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.607683 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"]
Nov 24 12:41:09 crc kubenswrapper[4752]: I1124 12:41:09.615257 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-cp9rm"]
Nov 24 12:41:10 crc kubenswrapper[4752]: I1124 12:41:10.741423 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" path="/var/lib/kubelet/pods/cbbdcce4-d55e-4504-83df-236789ebed4b/volumes"
Nov 24 12:41:10 crc kubenswrapper[4752]: E1124 12:41:10.814538 4752 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.145:41556->38.102.83.145:38429: read tcp 38.102.83.145:41556->38.102.83.145:38429: read: connection reset by peer
Nov 24 12:41:10 crc kubenswrapper[4752]: E1124 12:41:10.814638 4752 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.145:41556->38.102.83.145:38429: write tcp 38.102.83.145:41556->38.102.83.145:38429: write: broken pipe
Nov 24 12:41:12 crc kubenswrapper[4752]: I1124 12:41:12.728228 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:41:12 crc kubenswrapper[4752]: E1124 12:41:12.729073 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:41:25 crc kubenswrapper[4752]: I1124 12:41:25.728806 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:41:25 crc kubenswrapper[4752]: E1124 12:41:25.730108 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:41:37 crc kubenswrapper[4752]: I1124 12:41:37.728503 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:41:37 crc kubenswrapper[4752]: E1124 12:41:37.729581 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:41:49 crc kubenswrapper[4752]: I1124 12:41:49.997487 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"]
Nov 24 12:41:50 crc kubenswrapper[4752]: E1124 12:41:50.003218 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="init"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.003382 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="init"
Nov 24 12:41:50 crc kubenswrapper[4752]: E1124 12:41:50.003489 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="octavia-amphora-httpd"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.003572 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="octavia-amphora-httpd"
Nov 24 12:41:50 crc kubenswrapper[4752]: E1124 12:41:50.003687 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerName="octavia-db-sync"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.003782 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerName="octavia-db-sync"
Nov 24 12:41:50 crc kubenswrapper[4752]: E1124 12:41:50.003865 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerName="init"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.003920 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerName="init"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.004215 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbbdcce4-d55e-4504-83df-236789ebed4b" containerName="octavia-amphora-httpd"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.004326 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" containerName="octavia-db-sync"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.006636 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.010673 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"]
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.016892 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.017202 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gn85r"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.017447 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.017684 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.066675 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.067211 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-log" containerID="cri-o://39a6c2690f144ed244f0a31e220599fe374a4c98d8e6ec9697b62bb8c22b3d14" gracePeriod=30
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.067772 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-httpd" containerID="cri-o://ff73a3acfb77932f17ac7df874edbbbcc3ae7dbe26c5b6ab575d9a8599eea7c0" gracePeriod=30
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.131788 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"]
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.196264 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.196668 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"]
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.196627 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.196923 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-log" containerID="cri-o://9b0aae8c06275e141ede063b85eaeddb44ff2527a8822d67d87b4ba9edb9b29c" gracePeriod=30
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.197024 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-httpd" containerID="cri-o://324e2b142bd12d04603cad9693b5d4ad70118361be149b988b4ca1a86ae49b83" gracePeriod=30
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244598 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244644 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244677 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244704 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244757 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9sj\" (UniqueName: \"kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244783 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244801 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244821 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.244877 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrz6f\" (UniqueName: \"kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.346298 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.346595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrz6f\" (UniqueName: \"kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.346779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347013 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347142 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347259 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk"
Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347377 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"kube-api-access-5q9sj\" (UniqueName: \"kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347553 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.347654 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.348432 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.348604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.349150 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.350041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.351025 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.353236 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc 
kubenswrapper[4752]: I1124 12:41:50.355869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.360759 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.364957 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9sj\" (UniqueName: \"kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj\") pod \"horizon-b9cfc9c8f-x49vk\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.369276 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrz6f\" (UniqueName: \"kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f\") pod \"horizon-5d5b58f69-nb8jh\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.575926 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.626983 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"] Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.627785 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.664301 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.668802 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.680783 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.728662 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:41:50 crc kubenswrapper[4752]: E1124 12:41:50.728923 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.755665 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.755712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.755789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzv2k\" (UniqueName: \"kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.755958 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.756015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.787535 4752 generic.go:334] "Generic (PLEG): container finished" podID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerID="9b0aae8c06275e141ede063b85eaeddb44ff2527a8822d67d87b4ba9edb9b29c" exitCode=143 Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.787672 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerDied","Data":"9b0aae8c06275e141ede063b85eaeddb44ff2527a8822d67d87b4ba9edb9b29c"} Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.789323 4752 generic.go:334] "Generic (PLEG): 
container finished" podID="e908313d-0c05-4500-9ad2-c8f86d019672" containerID="39a6c2690f144ed244f0a31e220599fe374a4c98d8e6ec9697b62bb8c22b3d14" exitCode=143 Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.789363 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerDied","Data":"39a6c2690f144ed244f0a31e220599fe374a4c98d8e6ec9697b62bb8c22b3d14"} Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.856714 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.856783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.856812 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzv2k\" (UniqueName: \"kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.856888 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.856927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.857280 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.859194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.862609 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.868898 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:50 crc kubenswrapper[4752]: I1124 12:41:50.881471 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzv2k\" (UniqueName: \"kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k\") pod \"horizon-5dfcff4df9-n9brt\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.057214 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.214854 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"] Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.293008 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"] Nov 24 12:41:51 crc kubenswrapper[4752]: W1124 12:41:51.296080 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e8973a_2f6d_4af3_b72a_c8f6a5b319d5.slice/crio-33aba2beccd7c10af3065507f32ae2622ebebb74604dfc464838d4465c393949 WatchSource:0}: Error finding container 33aba2beccd7c10af3065507f32ae2622ebebb74604dfc464838d4465c393949: Status 404 returned error can't find the container with id 33aba2beccd7c10af3065507f32ae2622ebebb74604dfc464838d4465c393949 Nov 24 12:41:51 crc kubenswrapper[4752]: W1124 12:41:51.613917 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9442f57a_2b65_4f64_be4f_7c1a834acea9.slice/crio-2bb2e0b596e1faf56a723fb9a4f1aa98d3ed80e28a8b5c5eedccc7adbff6089f WatchSource:0}: Error finding container 2bb2e0b596e1faf56a723fb9a4f1aa98d3ed80e28a8b5c5eedccc7adbff6089f: Status 404 returned error can't find the container with id 2bb2e0b596e1faf56a723fb9a4f1aa98d3ed80e28a8b5c5eedccc7adbff6089f Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.614531 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.800437 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerStarted","Data":"2bb2e0b596e1faf56a723fb9a4f1aa98d3ed80e28a8b5c5eedccc7adbff6089f"} Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.803357 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerStarted","Data":"33aba2beccd7c10af3065507f32ae2622ebebb74604dfc464838d4465c393949"} Nov 24 12:41:51 crc kubenswrapper[4752]: I1124 12:41:51.804480 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerStarted","Data":"4c0fcd5c26dd06f39ca978e3cc807ef52edc7f780fcd4bcc6ed9b58ea41c956e"} Nov 24 12:41:54 crc kubenswrapper[4752]: I1124 12:41:54.303931 4752 generic.go:334] "Generic (PLEG): container finished" podID="e908313d-0c05-4500-9ad2-c8f86d019672" 
containerID="ff73a3acfb77932f17ac7df874edbbbcc3ae7dbe26c5b6ab575d9a8599eea7c0" exitCode=0 Nov 24 12:41:54 crc kubenswrapper[4752]: I1124 12:41:54.304059 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerDied","Data":"ff73a3acfb77932f17ac7df874edbbbcc3ae7dbe26c5b6ab575d9a8599eea7c0"} Nov 24 12:41:54 crc kubenswrapper[4752]: I1124 12:41:54.307539 4752 generic.go:334] "Generic (PLEG): container finished" podID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerID="324e2b142bd12d04603cad9693b5d4ad70118361be149b988b4ca1a86ae49b83" exitCode=0 Nov 24 12:41:54 crc kubenswrapper[4752]: I1124 12:41:54.307571 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerDied","Data":"324e2b142bd12d04603cad9693b5d4ad70118361be149b988b4ca1a86ae49b83"} Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.500430 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.501322 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.528914 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529009 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529247 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529312 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kd2mg\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529401 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529454 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529533 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8f9x\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529576 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529613 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle\") pod \"72d1cd63-792f-41e5-ab1a-b322680a18f1\" (UID: \"72d1cd63-792f-41e5-ab1a-b322680a18f1\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529639 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.529681 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data\") pod \"e908313d-0c05-4500-9ad2-c8f86d019672\" (UID: \"e908313d-0c05-4500-9ad2-c8f86d019672\") " Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.536817 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.537144 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs" (OuterVolumeSpecName: "logs") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.537215 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts" (OuterVolumeSpecName: "scripts") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.545101 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.545178 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs" (OuterVolumeSpecName: "logs") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.545814 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts" (OuterVolumeSpecName: "scripts") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.546597 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x" (OuterVolumeSpecName: "kube-api-access-n8f9x") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "kube-api-access-n8f9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.547023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg" (OuterVolumeSpecName: "kube-api-access-kd2mg") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "kube-api-access-kd2mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.549599 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph" (OuterVolumeSpecName: "ceph") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.583029 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph" (OuterVolumeSpecName: "ceph") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.585307 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.591488 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633552 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633579 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633592 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633606 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633619 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633630 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2mg\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-kube-api-access-kd2mg\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633642 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e908313d-0c05-4500-9ad2-c8f86d019672-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633654 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8f9x\" (UniqueName: \"kubernetes.io/projected/72d1cd63-792f-41e5-ab1a-b322680a18f1-kube-api-access-n8f9x\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633664 4752 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72d1cd63-792f-41e5-ab1a-b322680a18f1-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633676 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633688 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e908313d-0c05-4500-9ad2-c8f86d019672-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.633699 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.647518 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data" (OuterVolumeSpecName: "config-data") pod "72d1cd63-792f-41e5-ab1a-b322680a18f1" (UID: "72d1cd63-792f-41e5-ab1a-b322680a18f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.678465 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data" (OuterVolumeSpecName: "config-data") pod "e908313d-0c05-4500-9ad2-c8f86d019672" (UID: "e908313d-0c05-4500-9ad2-c8f86d019672"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.738616 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e908313d-0c05-4500-9ad2-c8f86d019672-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:58 crc kubenswrapper[4752]: I1124 12:41:58.738643 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d1cd63-792f-41e5-ab1a-b322680a18f1-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.388722 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerStarted","Data":"08c7eb9f1de04a8b92171de9f9f464e484aca0d154f855b0df43bc76c896d331"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.389180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerStarted","Data":"ccfe614516ec9d2089221b5b27ada912577a3d52ef7d6a9c134f3df4bca4091d"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.389131 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b9cfc9c8f-x49vk" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon-log" containerID="cri-o://ccfe614516ec9d2089221b5b27ada912577a3d52ef7d6a9c134f3df4bca4091d" gracePeriod=30 Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.389065 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b9cfc9c8f-x49vk" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon" containerID="cri-o://08c7eb9f1de04a8b92171de9f9f464e484aca0d154f855b0df43bc76c896d331" gracePeriod=30 Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.396932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"72d1cd63-792f-41e5-ab1a-b322680a18f1","Type":"ContainerDied","Data":"37d9198e31cd920fe6bcdc4c246a90e558abde334379af8bb1cb94b965b6d701"} Nov 24 12:41:59 crc 
kubenswrapper[4752]: I1124 12:41:59.396995 4752 scope.go:117] "RemoveContainer" containerID="324e2b142bd12d04603cad9693b5d4ad70118361be149b988b4ca1a86ae49b83" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.397144 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.401799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerStarted","Data":"2aec1bcbd6537bdc1d4c991e85f4c04680816e0ddd5fd765d8cae2f304716f58"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.401864 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerStarted","Data":"72c897d8b2815313f3c49cc17f64a0edcac20d9a02d2f7cb9ca6652e7f78c801"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.417486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e908313d-0c05-4500-9ad2-c8f86d019672","Type":"ContainerDied","Data":"96fd641a1e03cef22251f28cd456c9730e46bd9ba2e8c56f33d8b8d3ff1498bf"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.417465 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b9cfc9c8f-x49vk" podStartSLOduration=3.265781069 podStartE2EDuration="10.417443978s" podCreationTimestamp="2025-11-24 12:41:49 +0000 UTC" firstStartedPulling="2025-11-24 12:41:51.297957496 +0000 UTC m=+5717.282777785" lastFinishedPulling="2025-11-24 12:41:58.449620395 +0000 UTC m=+5724.434440694" observedRunningTime="2025-11-24 12:41:59.407339537 +0000 UTC m=+5725.392159836" watchObservedRunningTime="2025-11-24 12:41:59.417443978 +0000 UTC m=+5725.402264267" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.417589 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.419957 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerStarted","Data":"328fa17f7b2e7d6ce7d0ff5a9fb127aef069ff0b8009cfb11c3e0f3441aa0b7c"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.420002 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerStarted","Data":"0e2ca094df1700bef9c91bc549879094f0e8c20bb6b3de251d08d53391dde6c8"} Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.449690 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d5b58f69-nb8jh" podStartSLOduration=2.226694436 podStartE2EDuration="9.449672813s" podCreationTimestamp="2025-11-24 12:41:50 +0000 UTC" firstStartedPulling="2025-11-24 12:41:51.208163707 +0000 UTC m=+5717.192983996" lastFinishedPulling="2025-11-24 12:41:58.431142074 +0000 UTC m=+5724.415962373" observedRunningTime="2025-11-24 12:41:59.434439176 +0000 UTC m=+5725.419259485" watchObservedRunningTime="2025-11-24 12:41:59.449672813 +0000 UTC m=+5725.434493102" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.479000 4752 scope.go:117] "RemoveContainer" containerID="9b0aae8c06275e141ede063b85eaeddb44ff2527a8822d67d87b4ba9edb9b29c" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.486242 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dfcff4df9-n9brt" podStartSLOduration=2.614964096 podStartE2EDuration="9.486222513s" podCreationTimestamp="2025-11-24 12:41:50 +0000 UTC" firstStartedPulling="2025-11-24 12:41:51.616295997 +0000 UTC m=+5717.601116286" lastFinishedPulling="2025-11-24 12:41:58.487554424 +0000 UTC m=+5724.472374703" observedRunningTime="2025-11-24 12:41:59.461258416 +0000 UTC m=+5725.446078715" watchObservedRunningTime="2025-11-24 12:41:59.486222513 +0000 UTC m=+5725.471042802" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.491405 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.500465 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.520811 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.528657 4752 scope.go:117] "RemoveContainer" containerID="ff73a3acfb77932f17ac7df874edbbbcc3ae7dbe26c5b6ab575d9a8599eea7c0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.530677 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.554831 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: E1124 12:41:59.555289 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555304 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: E1124 
12:41:59.555325 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555332 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: E1124 12:41:59.555348 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555354 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: E1124 12:41:59.555371 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555377 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555580 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555601 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555614 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-log" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.555623 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" containerName="glance-httpd" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.556612 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.557845 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.558320 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.563049 4752 scope.go:117] "RemoveContainer" containerID="39a6c2690f144ed244f0a31e220599fe374a4c98d8e6ec9697b62bb8c22b3d14" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.563139 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.569311 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.577977 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jxspm" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.578194 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.578307 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.578957 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.662512 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4n9r\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-kube-api-access-m4n9r\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.662894 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.662929 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663011 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663032 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663078 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-ceph\") 
pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663166 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663218 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663246 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4624w\" (UniqueName: \"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-kube-api-access-4624w\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663287 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663312 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663328 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663387 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-logs\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.663452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766248 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-ceph\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766347 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766408 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4624w\" (UniqueName: \"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-kube-api-access-4624w\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766496 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766533 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766561 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766624 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-logs\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766759 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4n9r\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-kube-api-access-m4n9r\") pod 
\"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.766947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.767158 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-logs\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.767265 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-logs\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.767559 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/571d263a-a54a-4576-a0d2-cb6325f91b19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.767869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.774310 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-ceph\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.774592 
4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.776711 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.778210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-ceph\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.780709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.781236 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571d263a-a54a-4576-a0d2-cb6325f91b19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.787188 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.792417 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4n9r\" (UniqueName: \"kubernetes.io/projected/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-kube-api-access-m4n9r\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.793216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4624w\" (UniqueName: \"kubernetes.io/projected/571d263a-a54a-4576-a0d2-cb6325f91b19-kube-api-access-4624w\") pod \"glance-default-internal-api-0\" (UID: \"571d263a-a54a-4576-a0d2-cb6325f91b19\") " pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.808306 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d02810-1b43-40b7-9b3c-99f316e7d3a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b1d02810-1b43-40b7-9b3c-99f316e7d3a9\") " pod="openstack/glance-default-external-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.914892 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 12:41:59 crc kubenswrapper[4752]: I1124 12:41:59.925314 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.519759 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.577502 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.578071 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.628589 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.666591 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.744216 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d1cd63-792f-41e5-ab1a-b322680a18f1" path="/var/lib/kubelet/pods/72d1cd63-792f-41e5-ab1a-b322680a18f1/volumes" Nov 24 12:42:00 crc kubenswrapper[4752]: I1124 12:42:00.744974 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e908313d-0c05-4500-9ad2-c8f86d019672" path="/var/lib/kubelet/pods/e908313d-0c05-4500-9ad2-c8f86d019672/volumes" Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.059188 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.059263 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.443424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"571d263a-a54a-4576-a0d2-cb6325f91b19","Type":"ContainerStarted","Data":"fc5be9a267c7a9eb08367883678b2207be70741095497fae6e15dafb2ed26eb6"} Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.443685 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"571d263a-a54a-4576-a0d2-cb6325f91b19","Type":"ContainerStarted","Data":"024d0bcb6779ab540a4d1ccbba396c533bb219ca9f44fb4437ef9e6043122747"} Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.448886 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1d02810-1b43-40b7-9b3c-99f316e7d3a9","Type":"ContainerStarted","Data":"401ca9cd19b0cf0ffac61a192907d20bf7ae09c74471b0649404d368b9ca0519"} Nov 24 12:42:01 crc kubenswrapper[4752]: I1124 12:42:01.448917 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1d02810-1b43-40b7-9b3c-99f316e7d3a9","Type":"ContainerStarted","Data":"2e6dad95a38993b64ed39a553ac132241287f844d38fea5d7dcaa52e7795a82b"} Nov 24 12:42:02 crc kubenswrapper[4752]: I1124 12:42:02.463613 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"571d263a-a54a-4576-a0d2-cb6325f91b19","Type":"ContainerStarted","Data":"ca191e87158a82e94acaa8dcdfdec33a69741e5c6b95aa2916f7ec635edba9fd"} Nov 24 
12:42:02 crc kubenswrapper[4752]: I1124 12:42:02.466834 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b1d02810-1b43-40b7-9b3c-99f316e7d3a9","Type":"ContainerStarted","Data":"e94ca0637395e5ddb98bf4616455dc39057518f0a5e4675a06b3d66e461b3faa"} Nov 24 12:42:02 crc kubenswrapper[4752]: I1124 12:42:02.512011 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.511961931 podStartE2EDuration="3.511961931s" podCreationTimestamp="2025-11-24 12:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:42:02.489372132 +0000 UTC m=+5728.474192421" watchObservedRunningTime="2025-11-24 12:42:02.511961931 +0000 UTC m=+5728.496782240" Nov 24 12:42:02 crc kubenswrapper[4752]: I1124 12:42:02.532294 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.532274594 podStartE2EDuration="3.532274594s" podCreationTimestamp="2025-11-24 12:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:42:02.523497952 +0000 UTC m=+5728.508318241" watchObservedRunningTime="2025-11-24 12:42:02.532274594 +0000 UTC m=+5728.517094883" Nov 24 12:42:03 crc kubenswrapper[4752]: I1124 12:42:03.728886 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:42:03 crc kubenswrapper[4752]: E1124 12:42:03.729616 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.915058 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.917028 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.925572 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.925632 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.953923 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.997878 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 12:42:09 crc kubenswrapper[4752]: I1124 12:42:09.998488 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.000183 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.555142 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.555188 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.555202 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.555217 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:10 crc kubenswrapper[4752]: I1124 12:42:10.577872 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 24 12:42:11 crc kubenswrapper[4752]: I1124 12:42:11.060123 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Nov 24 12:42:12 crc kubenswrapper[4752]: I1124 12:42:12.569691 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 12:42:12 crc kubenswrapper[4752]: I1124 12:42:12.570069 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 12:42:13 crc kubenswrapper[4752]: I1124 12:42:13.201935 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:13 crc kubenswrapper[4752]: I1124 12:42:13.292935 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 12:42:13 crc kubenswrapper[4752]: I1124 12:42:13.432544 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 12:42:13 crc kubenswrapper[4752]: I1124 12:42:13.432625 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 12:42:13 crc kubenswrapper[4752]: I1124 12:42:13.765735 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 12:42:17 crc kubenswrapper[4752]: I1124 12:42:17.728171 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:42:17 crc kubenswrapper[4752]: E1124 12:42:17.728961 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.036895 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nh9fj"] Nov 24 12:42:22 crc 
kubenswrapper[4752]: I1124 12:42:22.052075 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1e06-account-create-vcjrj"] Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.063307 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nh9fj"] Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.074145 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1e06-account-create-vcjrj"] Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.364580 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.738280 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e5f7b0-c4d2-4672-8b81-87f84d56ac1e" path="/var/lib/kubelet/pods/84e5f7b0-c4d2-4672-8b81-87f84d56ac1e/volumes" Nov 24 12:42:22 crc kubenswrapper[4752]: I1124 12:42:22.740017 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3d459b-1cb1-4d4a-a781-53f77cb195e4" path="/var/lib/kubelet/pods/ae3d459b-1cb1-4d4a-a781-53f77cb195e4/volumes" Nov 24 12:42:23 crc kubenswrapper[4752]: I1124 12:42:23.033507 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:42:24 crc kubenswrapper[4752]: I1124 12:42:24.227835 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:24 crc kubenswrapper[4752]: I1124 12:42:24.698240 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:42:24 crc kubenswrapper[4752]: I1124 12:42:24.787044 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"] Nov 24 12:42:24 crc kubenswrapper[4752]: I1124 12:42:24.787272 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon-log" containerID="cri-o://72c897d8b2815313f3c49cc17f64a0edcac20d9a02d2f7cb9ca6652e7f78c801" gracePeriod=30 Nov 24 12:42:24 crc kubenswrapper[4752]: I1124 12:42:24.787397 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" containerID="cri-o://2aec1bcbd6537bdc1d4c991e85f4c04680816e0ddd5fd765d8cae2f304716f58" gracePeriod=30 Nov 24 12:42:28 crc kubenswrapper[4752]: I1124 12:42:28.802680 4752 generic.go:334] "Generic (PLEG): container finished" podID="e3343a79-7adf-4314-81ed-a106902874c2" containerID="2aec1bcbd6537bdc1d4c991e85f4c04680816e0ddd5fd765d8cae2f304716f58" exitCode=0 Nov 24 12:42:28 crc kubenswrapper[4752]: I1124 12:42:28.802799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerDied","Data":"2aec1bcbd6537bdc1d4c991e85f4c04680816e0ddd5fd765d8cae2f304716f58"} Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.054514 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9jxh8"] Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.064176 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9jxh8"] Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.821540 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerID="08c7eb9f1de04a8b92171de9f9f464e484aca0d154f855b0df43bc76c896d331" exitCode=137 Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.821572 4752 generic.go:334] "Generic (PLEG): container finished" podID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerID="ccfe614516ec9d2089221b5b27ada912577a3d52ef7d6a9c134f3df4bca4091d" exitCode=137 Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.821595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerDied","Data":"08c7eb9f1de04a8b92171de9f9f464e484aca0d154f855b0df43bc76c896d331"} Nov 24 12:42:29 crc kubenswrapper[4752]: I1124 12:42:29.821632 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerDied","Data":"ccfe614516ec9d2089221b5b27ada912577a3d52ef7d6a9c134f3df4bca4091d"} Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.333328 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.485314 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts\") pod \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.485370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs\") pod \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.485402 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key\") pod \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.485482 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data\") pod \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.485625 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9sj\" (UniqueName: \"kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj\") pod \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\" (UID: \"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5\") " Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.486297 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs" (OuterVolumeSpecName: "logs") pod "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" (UID: "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.492435 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" (UID: "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.493924 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj" (OuterVolumeSpecName: "kube-api-access-5q9sj") pod "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" (UID: "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5"). InnerVolumeSpecName "kube-api-access-5q9sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.514503 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data" (OuterVolumeSpecName: "config-data") pod "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" (UID: "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.520448 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts" (OuterVolumeSpecName: "scripts") pod "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" (UID: "68e8973a-2f6d-4af3-b72a-c8f6a5b319d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.577059 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.587703 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9sj\" (UniqueName: \"kubernetes.io/projected/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-kube-api-access-5q9sj\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.587793 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.587807 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.587819 4752 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.587830 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:30 crc 
kubenswrapper[4752]: I1124 12:42:30.727911 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:42:30 crc kubenswrapper[4752]: E1124 12:42:30.728197 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.745821 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bed6539-d927-4527-95f0-1643bd3b0cc7" path="/var/lib/kubelet/pods/7bed6539-d927-4527-95f0-1643bd3b0cc7/volumes" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.845069 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b9cfc9c8f-x49vk" event={"ID":"68e8973a-2f6d-4af3-b72a-c8f6a5b319d5","Type":"ContainerDied","Data":"33aba2beccd7c10af3065507f32ae2622ebebb74604dfc464838d4465c393949"} Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.845170 4752 scope.go:117] "RemoveContainer" containerID="08c7eb9f1de04a8b92171de9f9f464e484aca0d154f855b0df43bc76c896d331" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.845532 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b9cfc9c8f-x49vk" Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.892318 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"] Nov 24 12:42:30 crc kubenswrapper[4752]: I1124 12:42:30.908656 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b9cfc9c8f-x49vk"] Nov 24 12:42:31 crc kubenswrapper[4752]: I1124 12:42:31.059085 4752 scope.go:117] "RemoveContainer" containerID="ccfe614516ec9d2089221b5b27ada912577a3d52ef7d6a9c134f3df4bca4091d" Nov 24 12:42:32 crc kubenswrapper[4752]: I1124 12:42:32.740402 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" path="/var/lib/kubelet/pods/68e8973a-2f6d-4af3-b72a-c8f6a5b319d5/volumes" Nov 24 12:42:40 crc kubenswrapper[4752]: I1124 12:42:40.577270 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 24 12:42:42 crc kubenswrapper[4752]: I1124 12:42:42.728693 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:42:42 crc kubenswrapper[4752]: E1124 12:42:42.730094 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:42:43 crc kubenswrapper[4752]: I1124 12:42:43.581575 4752 scope.go:117] "RemoveContainer" 
containerID="b665efa9abf8dd36c2480e8a4fef3520a378fc471f87a1d4e667f7f0feb60a5f" Nov 24 12:42:43 crc kubenswrapper[4752]: I1124 12:42:43.620587 4752 scope.go:117] "RemoveContainer" containerID="926aab143f59578ae4b83e8719712c16bcd0b52c98b8f2af47f3ce97fd7afda7" Nov 24 12:42:43 crc kubenswrapper[4752]: I1124 12:42:43.679576 4752 scope.go:117] "RemoveContainer" containerID="0166f69cf3af1e5c15aaa08a472e637dea5361a73d2eaaef55a99edd701b8560" Nov 24 12:42:50 crc kubenswrapper[4752]: I1124 12:42:50.577822 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d5b58f69-nb8jh" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 24 12:42:50 crc kubenswrapper[4752]: I1124 12:42:50.578587 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:54 crc kubenswrapper[4752]: I1124 12:42:54.736576 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:42:54 crc kubenswrapper[4752]: E1124 12:42:54.737780 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.101854 4752 generic.go:334] "Generic (PLEG): container finished" podID="e3343a79-7adf-4314-81ed-a106902874c2" containerID="72c897d8b2815313f3c49cc17f64a0edcac20d9a02d2f7cb9ca6652e7f78c801" exitCode=137 Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.101923 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerDied","Data":"72c897d8b2815313f3c49cc17f64a0edcac20d9a02d2f7cb9ca6652e7f78c801"} Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.239056 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427387 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrz6f\" (UniqueName: \"kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f\") pod \"e3343a79-7adf-4314-81ed-a106902874c2\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key\") pod \"e3343a79-7adf-4314-81ed-a106902874c2\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427608 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs\") pod \"e3343a79-7adf-4314-81ed-a106902874c2\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427688 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data\") pod \"e3343a79-7adf-4314-81ed-a106902874c2\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427886 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts\") pod \"e3343a79-7adf-4314-81ed-a106902874c2\" (UID: \"e3343a79-7adf-4314-81ed-a106902874c2\") " Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.427998 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs" (OuterVolumeSpecName: "logs") pod "e3343a79-7adf-4314-81ed-a106902874c2" (UID: "e3343a79-7adf-4314-81ed-a106902874c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.428454 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3343a79-7adf-4314-81ed-a106902874c2-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.433378 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f" (OuterVolumeSpecName: "kube-api-access-mrz6f") pod "e3343a79-7adf-4314-81ed-a106902874c2" (UID: "e3343a79-7adf-4314-81ed-a106902874c2"). InnerVolumeSpecName "kube-api-access-mrz6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.433715 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e3343a79-7adf-4314-81ed-a106902874c2" (UID: "e3343a79-7adf-4314-81ed-a106902874c2"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.452077 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data" (OuterVolumeSpecName: "config-data") pod "e3343a79-7adf-4314-81ed-a106902874c2" (UID: "e3343a79-7adf-4314-81ed-a106902874c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.458216 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts" (OuterVolumeSpecName: "scripts") pod "e3343a79-7adf-4314-81ed-a106902874c2" (UID: "e3343a79-7adf-4314-81ed-a106902874c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.530798 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.530830 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrz6f\" (UniqueName: \"kubernetes.io/projected/e3343a79-7adf-4314-81ed-a106902874c2-kube-api-access-mrz6f\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.530845 4752 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3343a79-7adf-4314-81ed-a106902874c2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:55 crc kubenswrapper[4752]: I1124 12:42:55.530857 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3343a79-7adf-4314-81ed-a106902874c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.045026 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-05d1-account-create-49gwn"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.052849 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7v95l"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.061291 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-05d1-account-create-49gwn"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.069260 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7v95l"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.112062 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5b58f69-nb8jh" event={"ID":"e3343a79-7adf-4314-81ed-a106902874c2","Type":"ContainerDied","Data":"4c0fcd5c26dd06f39ca978e3cc807ef52edc7f780fcd4bcc6ed9b58ea41c956e"} Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.112116 4752 scope.go:117] "RemoveContainer" containerID="2aec1bcbd6537bdc1d4c991e85f4c04680816e0ddd5fd765d8cae2f304716f58" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.112119 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5b58f69-nb8jh" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.145266 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.152714 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d5b58f69-nb8jh"] Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.268049 4752 scope.go:117] "RemoveContainer" containerID="72c897d8b2815313f3c49cc17f64a0edcac20d9a02d2f7cb9ca6652e7f78c801" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.749330 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060" path="/var/lib/kubelet/pods/0cbf0bbd-6b1f-48e1-8ec4-8779d4c9d060/volumes" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.750869 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d0d60e-824d-4b68-aadc-0803ccd1130e" path="/var/lib/kubelet/pods/b7d0d60e-824d-4b68-aadc-0803ccd1130e/volumes" Nov 24 12:42:56 crc kubenswrapper[4752]: I1124 12:42:56.751622 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3343a79-7adf-4314-81ed-a106902874c2" path="/var/lib/kubelet/pods/e3343a79-7adf-4314-81ed-a106902874c2/volumes" Nov 24 12:43:05 crc kubenswrapper[4752]: I1124 12:43:05.036273 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4cs8l"] Nov 24 12:43:05 crc kubenswrapper[4752]: I1124 12:43:05.045876 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4cs8l"] Nov 24 12:43:06 crc kubenswrapper[4752]: I1124 12:43:06.729078 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:43:06 crc kubenswrapper[4752]: E1124 12:43:06.729638 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:43:06 crc kubenswrapper[4752]: I1124 12:43:06.742590 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72555f28-0bcc-4ada-bd18-48e12c193a4a" path="/var/lib/kubelet/pods/72555f28-0bcc-4ada-bd18-48e12c193a4a/volumes" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.513798 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-554cf5f95c-cppx5"] Nov 24 12:43:07 crc kubenswrapper[4752]: E1124 12:43:07.514633 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.514653 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: E1124 12:43:07.514691 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.514699 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: E1124 12:43:07.514757 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.514767 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: E1124 12:43:07.514784 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.514794 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.515018 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.515042 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.515061 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e8973a-2f6d-4af3-b72a-c8f6a5b319d5" containerName="horizon" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.515078 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3343a79-7adf-4314-81ed-a106902874c2" containerName="horizon-log" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.516422 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.524911 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554cf5f95c-cppx5"] Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.550027 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da80ccb9-4d2f-49d3-8689-5cf720968e94-horizon-secret-key\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.553694 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-config-data\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.555048 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-scripts\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.555473 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbrk\" (UniqueName: \"kubernetes.io/projected/da80ccb9-4d2f-49d3-8689-5cf720968e94-kube-api-access-5jbrk\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.555638 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da80ccb9-4d2f-49d3-8689-5cf720968e94-logs\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da80ccb9-4d2f-49d3-8689-5cf720968e94-logs\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da80ccb9-4d2f-49d3-8689-5cf720968e94-horizon-secret-key\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657270 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-config-data\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-scripts\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657417 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbrk\" (UniqueName: \"kubernetes.io/projected/da80ccb9-4d2f-49d3-8689-5cf720968e94-kube-api-access-5jbrk\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.657843 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da80ccb9-4d2f-49d3-8689-5cf720968e94-logs\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.658376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-scripts\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.658728 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da80ccb9-4d2f-49d3-8689-5cf720968e94-config-data\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.678611 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da80ccb9-4d2f-49d3-8689-5cf720968e94-horizon-secret-key\") pod \"horizon-554cf5f95c-cppx5\" (UID: 
\"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.679350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbrk\" (UniqueName: \"kubernetes.io/projected/da80ccb9-4d2f-49d3-8689-5cf720968e94-kube-api-access-5jbrk\") pod \"horizon-554cf5f95c-cppx5\" (UID: \"da80ccb9-4d2f-49d3-8689-5cf720968e94\") " pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:07 crc kubenswrapper[4752]: I1124 12:43:07.853176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.366191 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554cf5f95c-cppx5"] Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.498009 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.501825 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.513120 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.575227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.575275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvnd\" (UniqueName: \"kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.575418 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.676763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.676898 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.676939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvnd\" (UniqueName: 
\"kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.677688 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.677924 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.698886 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvnd\" (UniqueName: \"kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd\") pod \"redhat-operators-d74gw\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:08 crc kubenswrapper[4752]: I1124 12:43:08.819298 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.165839 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-jn9pl"] Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.168010 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.182836 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jn9pl"] Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.192992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.193053 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrbm\" (UniqueName: \"kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.259454 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554cf5f95c-cppx5" event={"ID":"da80ccb9-4d2f-49d3-8689-5cf720968e94","Type":"ContainerStarted","Data":"530b88a1418ae4ca6acd3aae4bbebc6a04f9f4d473d0a0ca76914749391b90c5"} Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.259505 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554cf5f95c-cppx5" event={"ID":"da80ccb9-4d2f-49d3-8689-5cf720968e94","Type":"ContainerStarted","Data":"eedf33a3f79f43bee4c9d39d36b7e90f2c380c80cddac54470557fbbe8d9b5a5"} Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.259514 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554cf5f95c-cppx5" event={"ID":"da80ccb9-4d2f-49d3-8689-5cf720968e94","Type":"ContainerStarted","Data":"3ad9dbedfb9601306a86a923cec6cd26df42e5dda98777d5120637f3ac722856"} Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.295417 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.295470 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrbm\" (UniqueName: \"kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.296448 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.302114 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-554cf5f95c-cppx5" podStartSLOduration=2.302096662 podStartE2EDuration="2.302096662s" podCreationTimestamp="2025-11-24 12:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:43:09.287230575 +0000 UTC 
m=+5795.272050864" watchObservedRunningTime="2025-11-24 12:43:09.302096662 +0000 UTC m=+5795.286916951" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.302316 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-8683-account-create-qqbxv"] Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.303640 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.306011 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.348869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrbm\" (UniqueName: \"kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm\") pod \"heat-db-create-jn9pl\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.354845 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8683-account-create-qqbxv"] Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.397043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.397489 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strlg\" (UniqueName: \"kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.439466 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:43:09 crc kubenswrapper[4752]: W1124 12:43:09.456350 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod099586b8_337b_49b9_93ce_8799907d222e.slice/crio-b2f0dd36dbb1bbccbc37e7fd6067a279a3bda4a8247ca7a0004231e93ae8b293 WatchSource:0}: Error finding container b2f0dd36dbb1bbccbc37e7fd6067a279a3bda4a8247ca7a0004231e93ae8b293: Status 404 returned error can't find the container with id b2f0dd36dbb1bbccbc37e7fd6067a279a3bda4a8247ca7a0004231e93ae8b293 Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.499669 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.500008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strlg\" (UniqueName: \"kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: 
I1124 12:43:09.500622 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.518397 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.518479 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strlg\" (UniqueName: \"kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg\") pod \"heat-8683-account-create-qqbxv\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:09 crc kubenswrapper[4752]: I1124 12:43:09.624192 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.088340 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jn9pl"] Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.178504 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8683-account-create-qqbxv"] Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.274623 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jn9pl" event={"ID":"b548dc2a-4fbc-4303-9ace-3e58cb26598f","Type":"ContainerStarted","Data":"deabe2389ba1f875c955e5e8828dc2639082816e0f7c9ef5e93f730e599ebcbf"} Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.277035 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8683-account-create-qqbxv" event={"ID":"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf","Type":"ContainerStarted","Data":"76de568844cdca691af337fc9c13494b601b53ef0e2442f449a9efc1c14ae4b3"} Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.282371 4752 generic.go:334] "Generic (PLEG): container finished" podID="099586b8-337b-49b9-93ce-8799907d222e" containerID="0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e" exitCode=0 Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.283512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerDied","Data":"0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e"} Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.283773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerStarted","Data":"b2f0dd36dbb1bbccbc37e7fd6067a279a3bda4a8247ca7a0004231e93ae8b293"} Nov 24 12:43:10 crc kubenswrapper[4752]: I1124 12:43:10.331035 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:43:11 crc kubenswrapper[4752]: I1124 12:43:11.309889 4752 generic.go:334] "Generic (PLEG): container finished" podID="b548dc2a-4fbc-4303-9ace-3e58cb26598f" containerID="6d596103c0c9697c86bb84c669d7afc4cb06ca9307cdf78f9a8cacda8b3692c7" exitCode=0 Nov 24 12:43:11 crc kubenswrapper[4752]: I1124 12:43:11.310267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jn9pl" 
event={"ID":"b548dc2a-4fbc-4303-9ace-3e58cb26598f","Type":"ContainerDied","Data":"6d596103c0c9697c86bb84c669d7afc4cb06ca9307cdf78f9a8cacda8b3692c7"} Nov 24 12:43:11 crc kubenswrapper[4752]: I1124 12:43:11.329180 4752 generic.go:334] "Generic (PLEG): container finished" podID="1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" containerID="87cb9603077deef98005c5cb1f2c032839dcb5cb7bdc1e8b33a5c0efaccbbeef" exitCode=0 Nov 24 12:43:11 crc kubenswrapper[4752]: I1124 12:43:11.329242 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8683-account-create-qqbxv" event={"ID":"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf","Type":"ContainerDied","Data":"87cb9603077deef98005c5cb1f2c032839dcb5cb7bdc1e8b33a5c0efaccbbeef"} Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.346660 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerStarted","Data":"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af"} Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.832672 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.839221 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.991552 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-strlg\" (UniqueName: \"kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg\") pod \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.991635 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcrbm\" (UniqueName: \"kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm\") pod \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.991828 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts\") pod \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\" (UID: \"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf\") " Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.991900 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts\") pod \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\" (UID: \"b548dc2a-4fbc-4303-9ace-3e58cb26598f\") " Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.992824 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b548dc2a-4fbc-4303-9ace-3e58cb26598f" (UID: "b548dc2a-4fbc-4303-9ace-3e58cb26598f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.994002 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" (UID: "1cb17497-6c03-4e7b-a42a-0d7f1829fbcf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:43:12 crc kubenswrapper[4752]: I1124 12:43:12.998949 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm" (OuterVolumeSpecName: "kube-api-access-rcrbm") pod "b548dc2a-4fbc-4303-9ace-3e58cb26598f" (UID: "b548dc2a-4fbc-4303-9ace-3e58cb26598f"). InnerVolumeSpecName "kube-api-access-rcrbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.000299 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg" (OuterVolumeSpecName: "kube-api-access-strlg") pod "1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" (UID: "1cb17497-6c03-4e7b-a42a-0d7f1829fbcf"). InnerVolumeSpecName "kube-api-access-strlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.096103 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcrbm\" (UniqueName: \"kubernetes.io/projected/b548dc2a-4fbc-4303-9ace-3e58cb26598f-kube-api-access-rcrbm\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.096139 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.096149 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b548dc2a-4fbc-4303-9ace-3e58cb26598f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.096158 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-strlg\" (UniqueName: \"kubernetes.io/projected/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf-kube-api-access-strlg\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.365316 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jn9pl" event={"ID":"b548dc2a-4fbc-4303-9ace-3e58cb26598f","Type":"ContainerDied","Data":"deabe2389ba1f875c955e5e8828dc2639082816e0f7c9ef5e93f730e599ebcbf"} Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.365686 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deabe2389ba1f875c955e5e8828dc2639082816e0f7c9ef5e93f730e599ebcbf" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.365351 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jn9pl" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.375974 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-8683-account-create-qqbxv" Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.376036 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8683-account-create-qqbxv" event={"ID":"1cb17497-6c03-4e7b-a42a-0d7f1829fbcf","Type":"ContainerDied","Data":"76de568844cdca691af337fc9c13494b601b53ef0e2442f449a9efc1c14ae4b3"} Nov 24 12:43:13 crc kubenswrapper[4752]: I1124 12:43:13.376071 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76de568844cdca691af337fc9c13494b601b53ef0e2442f449a9efc1c14ae4b3" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.425911 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-dxb8j"] Nov 24 12:43:14 crc kubenswrapper[4752]: E1124 12:43:14.426521 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b548dc2a-4fbc-4303-9ace-3e58cb26598f" containerName="mariadb-database-create" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.426544 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b548dc2a-4fbc-4303-9ace-3e58cb26598f" containerName="mariadb-database-create" Nov 24 12:43:14 crc kubenswrapper[4752]: E1124 12:43:14.426585 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" containerName="mariadb-account-create" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.426598 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" containerName="mariadb-account-create" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.426960 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" containerName="mariadb-account-create" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.426990 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b548dc2a-4fbc-4303-9ace-3e58cb26598f" containerName="mariadb-database-create" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.428011 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.434280 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhnlh" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.435884 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dxb8j"] Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.435982 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.528515 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.528573 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.529062 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rz6\" (UniqueName: \"kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.630706 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rz6\" (UniqueName: \"kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.630930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.630961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.636488 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.638613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" 
Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.663255 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rz6\" (UniqueName: \"kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6\") pod \"heat-db-sync-dxb8j\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:14 crc kubenswrapper[4752]: I1124 12:43:14.764070 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:15 crc kubenswrapper[4752]: I1124 12:43:15.343355 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dxb8j"] Nov 24 12:43:15 crc kubenswrapper[4752]: I1124 12:43:15.404101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dxb8j" event={"ID":"49d043b2-da1a-4a85-9193-9b9214e09776","Type":"ContainerStarted","Data":"03e1a3baf2063a3bef4ed1272de18b2c9857021af8e8a047e9c25439706b1205"} Nov 24 12:43:17 crc kubenswrapper[4752]: I1124 12:43:17.853800 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:17 crc kubenswrapper[4752]: I1124 12:43:17.855133 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:20 crc kubenswrapper[4752]: I1124 12:43:20.728636 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:43:20 crc kubenswrapper[4752]: E1124 12:43:20.729232 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:43:21 crc kubenswrapper[4752]: I1124 12:43:21.482954 4752 generic.go:334] "Generic (PLEG): container finished" podID="099586b8-337b-49b9-93ce-8799907d222e" containerID="c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af" exitCode=0 Nov 24 12:43:21 crc kubenswrapper[4752]: I1124 12:43:21.483049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerDied","Data":"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af"} Nov 24 12:43:22 crc kubenswrapper[4752]: I1124 12:43:22.505158 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerStarted","Data":"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0"} Nov 24 12:43:22 crc kubenswrapper[4752]: I1124 12:43:22.509160 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dxb8j" event={"ID":"49d043b2-da1a-4a85-9193-9b9214e09776","Type":"ContainerStarted","Data":"7ebb701caad73543aafffd680acd0e9a3c49f28e9df8899a36efdf8261e1d622"} Nov 24 12:43:22 crc kubenswrapper[4752]: I1124 12:43:22.528485 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d74gw" podStartSLOduration=2.738799418 podStartE2EDuration="14.528465035s" podCreationTimestamp="2025-11-24 12:43:08 +0000 UTC" 
firstStartedPulling="2025-11-24 12:43:10.330690209 +0000 UTC m=+5796.315510498" lastFinishedPulling="2025-11-24 12:43:22.120355816 +0000 UTC m=+5808.105176115" observedRunningTime="2025-11-24 12:43:22.520223659 +0000 UTC m=+5808.505043948" watchObservedRunningTime="2025-11-24 12:43:22.528465035 +0000 UTC m=+5808.513285324" Nov 24 12:43:24 crc kubenswrapper[4752]: I1124 12:43:24.526369 4752 generic.go:334] "Generic (PLEG): container finished" podID="49d043b2-da1a-4a85-9193-9b9214e09776" containerID="7ebb701caad73543aafffd680acd0e9a3c49f28e9df8899a36efdf8261e1d622" exitCode=0 Nov 24 12:43:24 crc kubenswrapper[4752]: I1124 12:43:24.526407 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dxb8j" event={"ID":"49d043b2-da1a-4a85-9193-9b9214e09776","Type":"ContainerDied","Data":"7ebb701caad73543aafffd680acd0e9a3c49f28e9df8899a36efdf8261e1d622"} Nov 24 12:43:25 crc kubenswrapper[4752]: I1124 12:43:25.960155 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.094088 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle\") pod \"49d043b2-da1a-4a85-9193-9b9214e09776\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.094260 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data\") pod \"49d043b2-da1a-4a85-9193-9b9214e09776\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.094305 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5rz6\" (UniqueName: \"kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6\") pod \"49d043b2-da1a-4a85-9193-9b9214e09776\" (UID: \"49d043b2-da1a-4a85-9193-9b9214e09776\") " Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.118072 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6" (OuterVolumeSpecName: "kube-api-access-l5rz6") pod "49d043b2-da1a-4a85-9193-9b9214e09776" (UID: "49d043b2-da1a-4a85-9193-9b9214e09776"). InnerVolumeSpecName "kube-api-access-l5rz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.131376 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d043b2-da1a-4a85-9193-9b9214e09776" (UID: "49d043b2-da1a-4a85-9193-9b9214e09776"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.183204 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data" (OuterVolumeSpecName: "config-data") pod "49d043b2-da1a-4a85-9193-9b9214e09776" (UID: "49d043b2-da1a-4a85-9193-9b9214e09776"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.197004 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5rz6\" (UniqueName: \"kubernetes.io/projected/49d043b2-da1a-4a85-9193-9b9214e09776-kube-api-access-l5rz6\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.197035 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.197045 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d043b2-da1a-4a85-9193-9b9214e09776-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.553444 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dxb8j" event={"ID":"49d043b2-da1a-4a85-9193-9b9214e09776","Type":"ContainerDied","Data":"03e1a3baf2063a3bef4ed1272de18b2c9857021af8e8a047e9c25439706b1205"} Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.553857 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e1a3baf2063a3bef4ed1272de18b2c9857021af8e8a047e9c25439706b1205" Nov 24 12:43:26 crc kubenswrapper[4752]: I1124 12:43:26.553932 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dxb8j" Nov 24 12:43:27 crc kubenswrapper[4752]: I1124 12:43:27.855316 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-554cf5f95c-cppx5" podUID="da80ccb9-4d2f-49d3-8689-5cf720968e94" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.178605 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-85c675c6dc-z7nps"] Nov 24 12:43:28 crc kubenswrapper[4752]: E1124 12:43:28.179578 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d043b2-da1a-4a85-9193-9b9214e09776" containerName="heat-db-sync" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.179645 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d043b2-da1a-4a85-9193-9b9214e09776" containerName="heat-db-sync" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.179970 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d043b2-da1a-4a85-9193-9b9214e09776" containerName="heat-db-sync" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.180817 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.182771 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhnlh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.184727 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.187271 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.208618 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85c675c6dc-z7nps"] Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.242425 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data-custom\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.242561 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5g9c\" (UniqueName: \"kubernetes.io/projected/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-kube-api-access-b5g9c\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.242680 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.242793 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-combined-ca-bundle\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.297659 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-656f79b868-6plw4"] Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.299555 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.302364 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.314096 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-656f79b868-6plw4"] Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.345439 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5g9c\" (UniqueName: \"kubernetes.io/projected/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-kube-api-access-b5g9c\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.345559 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.345844 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-combined-ca-bundle\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.346790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.346852 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsknd\" (UniqueName: \"kubernetes.io/projected/62674a23-d680-4d6f-9764-aed369cee2a0-kube-api-access-fsknd\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.346907 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data-custom\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.346970 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data-custom\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.347006 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-combined-ca-bundle\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " 
pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.359183 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data-custom\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.365733 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-combined-ca-bundle\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.377056 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-config-data\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.379766 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-78b7786c85-fxgnh"] Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.381433 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.386098 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.392941 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78b7786c85-fxgnh"] Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.393478 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5g9c\" (UniqueName: \"kubernetes.io/projected/d41cbaf4-ce55-43f4-8940-f58a0b2c62c0-kube-api-access-b5g9c\") pod \"heat-engine-85c675c6dc-z7nps\" (UID: \"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0\") " pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451154 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsknd\" (UniqueName: \"kubernetes.io/projected/62674a23-d680-4d6f-9764-aed369cee2a0-kube-api-access-fsknd\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451196 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-combined-ca-bundle\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451221 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfwkv\" (UniqueName: \"kubernetes.io/projected/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-kube-api-access-rfwkv\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451240 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data-custom\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451285 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-combined-ca-bundle\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451326 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.451361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data-custom\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.458926 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data-custom\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.476374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-combined-ca-bundle\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.476715 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62674a23-d680-4d6f-9764-aed369cee2a0-config-data\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.478034 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsknd\" (UniqueName: \"kubernetes.io/projected/62674a23-d680-4d6f-9764-aed369cee2a0-kube-api-access-fsknd\") pod \"heat-api-656f79b868-6plw4\" (UID: \"62674a23-d680-4d6f-9764-aed369cee2a0\") " pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.506180 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.554344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-combined-ca-bundle\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.554835 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfwkv\" (UniqueName: \"kubernetes.io/projected/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-kube-api-access-rfwkv\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.554931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.554977 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data-custom\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.562207 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-combined-ca-bundle\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.564115 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data-custom\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.567204 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-config-data\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.576239 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfwkv\" (UniqueName: \"kubernetes.io/projected/7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9-kube-api-access-rfwkv\") pod \"heat-cfnapi-78b7786c85-fxgnh\" (UID: \"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9\") " pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.630728 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.668136 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.820523 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:28 crc kubenswrapper[4752]: I1124 12:43:28.820873 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.003214 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-85c675c6dc-z7nps"] Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.190362 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78b7786c85-fxgnh"] Nov 24 12:43:29 crc kubenswrapper[4752]: W1124 12:43:29.193051 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a2456d8_9cb0_4de8_b42a_bc5b7f7bd5a9.slice/crio-4a1335d34a25957441094a4ca6c13c72af69b77a1ebceeaae08faf480f2aa58f WatchSource:0}: Error finding container 4a1335d34a25957441094a4ca6c13c72af69b77a1ebceeaae08faf480f2aa58f: Status 404 returned error can't find the container with id 4a1335d34a25957441094a4ca6c13c72af69b77a1ebceeaae08faf480f2aa58f Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.280011 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-656f79b868-6plw4"] Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.591982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-656f79b868-6plw4" event={"ID":"62674a23-d680-4d6f-9764-aed369cee2a0","Type":"ContainerStarted","Data":"2bfdc0813e8f31c1cabfe79fb76b0c0d8389f0d99e55f94943fad1a663b6d7b4"} Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.593536 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" event={"ID":"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9","Type":"ContainerStarted","Data":"4a1335d34a25957441094a4ca6c13c72af69b77a1ebceeaae08faf480f2aa58f"} Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.595208 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85c675c6dc-z7nps" event={"ID":"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0","Type":"ContainerStarted","Data":"1eb6959d6992e70a881159be6079e85d0e76ada598d398bba02f4b3b55765e61"} Nov 24 12:43:29 crc kubenswrapper[4752]: I1124 12:43:29.874264 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d74gw" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" probeResult="failure" output=< Nov 24 12:43:29 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:43:29 crc kubenswrapper[4752]: > Nov 24 12:43:30 crc kubenswrapper[4752]: I1124 12:43:30.607267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-85c675c6dc-z7nps" event={"ID":"d41cbaf4-ce55-43f4-8940-f58a0b2c62c0","Type":"ContainerStarted","Data":"3b55f5a5befb1657e081da3e0b271f5589eb79ece7ba75843847c5576bf3d7d7"} Nov 24 12:43:30 crc kubenswrapper[4752]: I1124 12:43:30.607822 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:30 crc kubenswrapper[4752]: I1124 12:43:30.634889 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-85c675c6dc-z7nps" podStartSLOduration=2.634868052 
podStartE2EDuration="2.634868052s" podCreationTimestamp="2025-11-24 12:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:43:30.626163422 +0000 UTC m=+5816.610983711" watchObservedRunningTime="2025-11-24 12:43:30.634868052 +0000 UTC m=+5816.619688341" Nov 24 12:43:33 crc kubenswrapper[4752]: I1124 12:43:33.642087 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" event={"ID":"7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9","Type":"ContainerStarted","Data":"37f1f451cb47b04c935faa72dbeb76ddb0a3b0c265bc3963e30a84a46a83a2dc"} Nov 24 12:43:33 crc kubenswrapper[4752]: I1124 12:43:33.642817 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:33 crc kubenswrapper[4752]: I1124 12:43:33.644011 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-656f79b868-6plw4" event={"ID":"62674a23-d680-4d6f-9764-aed369cee2a0","Type":"ContainerStarted","Data":"13805ff6dbc6b0f4e7354d084901c716d94d63850c2fbb7a98675ce0c5ac50b1"} Nov 24 12:43:33 crc kubenswrapper[4752]: I1124 12:43:33.644683 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:33 crc kubenswrapper[4752]: I1124 12:43:33.673109 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" podStartSLOduration=2.5064616749999997 podStartE2EDuration="5.673088889s" podCreationTimestamp="2025-11-24 12:43:28 +0000 UTC" firstStartedPulling="2025-11-24 12:43:29.195789837 +0000 UTC m=+5815.180610126" lastFinishedPulling="2025-11-24 12:43:32.362417051 +0000 UTC m=+5818.347237340" observedRunningTime="2025-11-24 12:43:33.671075911 +0000 UTC m=+5819.655896210" watchObservedRunningTime="2025-11-24 12:43:33.673088889 +0000 UTC m=+5819.657909188" Nov 24 12:43:34 crc kubenswrapper[4752]: I1124 12:43:34.755472 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:43:34 crc kubenswrapper[4752]: E1124 12:43:34.756181 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:43:39 crc kubenswrapper[4752]: I1124 12:43:39.878463 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d74gw" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" probeResult="failure" output=< Nov 24 12:43:39 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:43:39 crc kubenswrapper[4752]: > Nov 24 12:43:39 crc kubenswrapper[4752]: I1124 12:43:39.996807 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:40 crc kubenswrapper[4752]: I1124 12:43:40.017316 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-656f79b868-6plw4" podStartSLOduration=8.967176003 podStartE2EDuration="12.017292401s" podCreationTimestamp="2025-11-24 12:43:28 +0000 UTC" 
firstStartedPulling="2025-11-24 12:43:29.314723542 +0000 UTC m=+5815.299543831" lastFinishedPulling="2025-11-24 12:43:32.36483994 +0000 UTC m=+5818.349660229" observedRunningTime="2025-11-24 12:43:33.70828846 +0000 UTC m=+5819.693108749" watchObservedRunningTime="2025-11-24 12:43:40.017292401 +0000 UTC m=+5826.002112690" Nov 24 12:43:40 crc kubenswrapper[4752]: I1124 12:43:40.784773 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-78b7786c85-fxgnh" Nov 24 12:43:40 crc kubenswrapper[4752]: I1124 12:43:40.824391 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-656f79b868-6plw4" Nov 24 12:43:41 crc kubenswrapper[4752]: I1124 12:43:41.867273 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-554cf5f95c-cppx5" Nov 24 12:43:41 crc kubenswrapper[4752]: I1124 12:43:41.934531 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:43:41 crc kubenswrapper[4752]: I1124 12:43:41.934819 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon-log" containerID="cri-o://0e2ca094df1700bef9c91bc549879094f0e8c20bb6b3de251d08d53391dde6c8" gracePeriod=30 Nov 24 12:43:41 crc kubenswrapper[4752]: I1124 12:43:41.934875 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" containerID="cri-o://328fa17f7b2e7d6ce7d0ff5a9fb127aef069ff0b8009cfb11c3e0f3441aa0b7c" gracePeriod=30 Nov 24 12:43:43 crc kubenswrapper[4752]: I1124 12:43:43.814726 4752 scope.go:117] "RemoveContainer" containerID="09b52e83937056b67ffaaf5fd535fd3ec55d586875012f9a299273e31d77d592" Nov 24 12:43:43 crc kubenswrapper[4752]: I1124 12:43:43.865823 4752 scope.go:117] "RemoveContainer" containerID="ca89dccb86162f3dd03a28a60e76ad881866be87b888235d3fa3a5596c60fe94" Nov 24 12:43:43 crc kubenswrapper[4752]: I1124 12:43:43.892015 4752 scope.go:117] "RemoveContainer" containerID="62fccc183a19821ab053524c28a352d904efaeac505327c8c34b214b0afa452f" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.435506 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.439412 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.450298 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.586642 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.586770 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.586844 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7c48\" (UniqueName: \"kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.688378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7c48\" (UniqueName: \"kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.688578 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.688644 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.689188 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.689245 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.712494 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c7c48\" (UniqueName: \"kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48\") pod \"community-operators-mxh8z\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.777728 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.824681 4752 generic.go:334] "Generic (PLEG): container finished" podID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerID="328fa17f7b2e7d6ce7d0ff5a9fb127aef069ff0b8009cfb11c3e0f3441aa0b7c" exitCode=0 Nov 24 12:43:45 crc kubenswrapper[4752]: I1124 12:43:45.824801 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerDied","Data":"328fa17f7b2e7d6ce7d0ff5a9fb127aef069ff0b8009cfb11c3e0f3441aa0b7c"} Nov 24 12:43:46 crc kubenswrapper[4752]: I1124 12:43:46.338410 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:46 crc kubenswrapper[4752]: I1124 12:43:46.841239 4752 generic.go:334] "Generic (PLEG): container finished" podID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerID="f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572" exitCode=0 Nov 24 12:43:46 crc kubenswrapper[4752]: I1124 12:43:46.841320 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerDied","Data":"f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572"} Nov 24 12:43:46 crc kubenswrapper[4752]: I1124 12:43:46.841374 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerStarted","Data":"e6cde119ad7cac177d188e3d5988c041985bcf8683ec23612c76893a817eaa48"} Nov 24 12:43:47 crc kubenswrapper[4752]: I1124 12:43:47.039519 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3f3c-account-create-tkbzs"] Nov 24 12:43:47 crc kubenswrapper[4752]: I1124 12:43:47.048945 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zfhsz"] Nov 24 12:43:47 crc kubenswrapper[4752]: I1124 12:43:47.056674 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3f3c-account-create-tkbzs"] Nov 24 12:43:47 crc kubenswrapper[4752]: I1124 12:43:47.064955 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zfhsz"] Nov 24 12:43:47 crc kubenswrapper[4752]: I1124 12:43:47.852056 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerStarted","Data":"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6"} Nov 24 12:43:48 crc kubenswrapper[4752]: I1124 12:43:48.551131 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-85c675c6dc-z7nps" Nov 24 12:43:48 crc kubenswrapper[4752]: I1124 12:43:48.741858 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12492e12-167b-4d92-8a59-fbb72d9b714a" path="/var/lib/kubelet/pods/12492e12-167b-4d92-8a59-fbb72d9b714a/volumes" Nov 24 
12:43:48 crc kubenswrapper[4752]: I1124 12:43:48.743717 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e" path="/var/lib/kubelet/pods/1653b7b3-f4fb-47a6-ba38-f4d07d0bc55e/volumes" Nov 24 12:43:49 crc kubenswrapper[4752]: I1124 12:43:49.729506 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:43:49 crc kubenswrapper[4752]: E1124 12:43:49.729894 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:43:49 crc kubenswrapper[4752]: I1124 12:43:49.872420 4752 generic.go:334] "Generic (PLEG): container finished" podID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerID="5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6" exitCode=0 Nov 24 12:43:49 crc kubenswrapper[4752]: I1124 12:43:49.872473 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerDied","Data":"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6"} Nov 24 12:43:49 crc kubenswrapper[4752]: I1124 12:43:49.875797 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d74gw" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" probeResult="failure" output=< Nov 24 12:43:49 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:43:49 crc kubenswrapper[4752]: > Nov 24 12:43:50 crc kubenswrapper[4752]: I1124 12:43:50.893868 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerStarted","Data":"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0"} Nov 24 12:43:50 crc kubenswrapper[4752]: I1124 12:43:50.936597 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxh8z" podStartSLOduration=2.375682228 podStartE2EDuration="5.936573683s" podCreationTimestamp="2025-11-24 12:43:45 +0000 UTC" firstStartedPulling="2025-11-24 12:43:46.844961417 +0000 UTC m=+5832.829781706" lastFinishedPulling="2025-11-24 12:43:50.405852872 +0000 UTC m=+5836.390673161" observedRunningTime="2025-11-24 12:43:50.930228071 +0000 UTC m=+5836.915048360" watchObservedRunningTime="2025-11-24 12:43:50.936573683 +0000 UTC m=+5836.921393972" Nov 24 12:43:51 crc kubenswrapper[4752]: I1124 12:43:51.058854 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Nov 24 12:43:55 crc kubenswrapper[4752]: I1124 12:43:55.778874 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:55 crc kubenswrapper[4752]: I1124 12:43:55.779447 4752 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:55 crc kubenswrapper[4752]: I1124 12:43:55.842193 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:55 crc kubenswrapper[4752]: I1124 12:43:55.987840 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:56 crc kubenswrapper[4752]: I1124 12:43:56.042486 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lp9hg"] Nov 24 12:43:56 crc kubenswrapper[4752]: I1124 12:43:56.055124 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lp9hg"] Nov 24 12:43:56 crc kubenswrapper[4752]: I1124 12:43:56.079815 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:56 crc kubenswrapper[4752]: I1124 12:43:56.745855 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b679aa-aa7b-4e82-9bbb-9c2642c7feb8" path="/var/lib/kubelet/pods/52b679aa-aa7b-4e82-9bbb-9c2642c7feb8/volumes" Nov 24 12:43:57 crc kubenswrapper[4752]: I1124 12:43:57.962527 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxh8z" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="registry-server" containerID="cri-o://bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0" gracePeriod=2 Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.525407 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.593146 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7c48\" (UniqueName: \"kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48\") pod \"a1ad488f-020d-46f6-abc8-7b789cfc4594\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.593207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content\") pod \"a1ad488f-020d-46f6-abc8-7b789cfc4594\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.593253 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities\") pod \"a1ad488f-020d-46f6-abc8-7b789cfc4594\" (UID: \"a1ad488f-020d-46f6-abc8-7b789cfc4594\") " Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.594139 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities" (OuterVolumeSpecName: "utilities") pod "a1ad488f-020d-46f6-abc8-7b789cfc4594" (UID: "a1ad488f-020d-46f6-abc8-7b789cfc4594"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.600023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48" (OuterVolumeSpecName: "kube-api-access-c7c48") pod "a1ad488f-020d-46f6-abc8-7b789cfc4594" (UID: "a1ad488f-020d-46f6-abc8-7b789cfc4594"). InnerVolumeSpecName "kube-api-access-c7c48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.664626 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1ad488f-020d-46f6-abc8-7b789cfc4594" (UID: "a1ad488f-020d-46f6-abc8-7b789cfc4594"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.695637 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7c48\" (UniqueName: \"kubernetes.io/projected/a1ad488f-020d-46f6-abc8-7b789cfc4594-kube-api-access-c7c48\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.695680 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.695692 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ad488f-020d-46f6-abc8-7b789cfc4594-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.975223 4752 generic.go:334] "Generic (PLEG): container finished" podID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerID="bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0" exitCode=0 Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.975309 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerDied","Data":"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0"} Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.975336 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxh8z" Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.975379 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxh8z" event={"ID":"a1ad488f-020d-46f6-abc8-7b789cfc4594","Type":"ContainerDied","Data":"e6cde119ad7cac177d188e3d5988c041985bcf8683ec23612c76893a817eaa48"} Nov 24 12:43:58 crc kubenswrapper[4752]: I1124 12:43:58.975421 4752 scope.go:117] "RemoveContainer" containerID="bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.007133 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.018933 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxh8z"] Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.028832 4752 scope.go:117] "RemoveContainer" containerID="5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.078212 4752 scope.go:117] "RemoveContainer" containerID="f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.107730 4752 scope.go:117] "RemoveContainer" containerID="bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0" Nov 24 12:43:59 crc kubenswrapper[4752]: E1124 12:43:59.108189 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0\": container with ID starting with bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0 not found: ID does not exist" containerID="bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.108226 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0"} err="failed to get container status \"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0\": rpc error: code = NotFound desc = could not find container \"bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0\": container with ID starting with bb55767101d8995670ed36fdfac3c14e4c0e06e2bd4d01c7768da316df7b86b0 not found: ID does not exist" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.108250 4752 scope.go:117] "RemoveContainer" containerID="5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6" Nov 24 12:43:59 crc kubenswrapper[4752]: E1124 12:43:59.108582 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6\": container with ID starting with 5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6 not found: ID does not exist" containerID="5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.108622 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6"} err="failed to get container status \"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6\": rpc error: code = NotFound desc = could not find 
container \"5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6\": container with ID starting with 5e0924d21f17052ca88c37608d34334d5dc7c4e6c7efa9aebeb057901dd4f9e6 not found: ID does not exist" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.108660 4752 scope.go:117] "RemoveContainer" containerID="f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572" Nov 24 12:43:59 crc kubenswrapper[4752]: E1124 12:43:59.109247 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572\": container with ID starting with f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572 not found: ID does not exist" containerID="f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.109278 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572"} err="failed to get container status \"f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572\": rpc error: code = NotFound desc = could not find container \"f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572\": container with ID starting with f315e8705231cecc47d65c04126bf98094d8efb57dc49c070d195176b4b1f572 not found: ID does not exist" Nov 24 12:43:59 crc kubenswrapper[4752]: I1124 12:43:59.886037 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d74gw" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" probeResult="failure" output=< Nov 24 12:43:59 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:43:59 crc kubenswrapper[4752]: > Nov 24 12:44:00 crc kubenswrapper[4752]: I1124 12:44:00.728983 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:44:00 crc kubenswrapper[4752]: E1124 12:44:00.729600 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:44:00 crc kubenswrapper[4752]: I1124 12:44:00.748617 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" path="/var/lib/kubelet/pods/a1ad488f-020d-46f6-abc8-7b789cfc4594/volumes" Nov 24 12:44:01 crc kubenswrapper[4752]: I1124 12:44:01.059302 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.954378 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w"] Nov 24 12:44:02 crc kubenswrapper[4752]: E1124 12:44:02.955238 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" 
containerName="extract-utilities" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.955257 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="extract-utilities" Nov 24 12:44:02 crc kubenswrapper[4752]: E1124 12:44:02.955275 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="extract-content" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.955285 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="extract-content" Nov 24 12:44:02 crc kubenswrapper[4752]: E1124 12:44:02.955293 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="registry-server" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.955301 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="registry-server" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.955540 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ad488f-020d-46f6-abc8-7b789cfc4594" containerName="registry-server" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.957475 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.962087 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.978877 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w"] Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.988759 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdh7p\" (UniqueName: \"kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.988939 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:02 crc kubenswrapper[4752]: I1124 12:44:02.989038 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.089863 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdh7p\" (UniqueName: 
\"kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.089924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.089955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.090522 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.090700 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.115595 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdh7p\" (UniqueName: \"kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:03 crc kubenswrapper[4752]: I1124 12:44:03.288096 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:04 crc kubenswrapper[4752]: I1124 12:44:04.018905 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w"] Nov 24 12:44:05 crc kubenswrapper[4752]: I1124 12:44:05.046642 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerStarted","Data":"cfa18c8913ca4ea15ee0d652c14d99c550d8df25a10d7f7ab8b252943a52168e"} Nov 24 12:44:05 crc kubenswrapper[4752]: I1124 12:44:05.047240 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerStarted","Data":"5c372e5c913c0904d526fcb773957e25683641dfc749b6a11a2cad933a949f3a"} Nov 24 12:44:06 crc kubenswrapper[4752]: I1124 12:44:06.058958 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b24258d-310a-4504-829b-5bbd08f76070" containerID="cfa18c8913ca4ea15ee0d652c14d99c550d8df25a10d7f7ab8b252943a52168e" exitCode=0 Nov 24 12:44:06 crc kubenswrapper[4752]: I1124 12:44:06.059006 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerDied","Data":"cfa18c8913ca4ea15ee0d652c14d99c550d8df25a10d7f7ab8b252943a52168e"} Nov 24 12:44:08 crc kubenswrapper[4752]: I1124 12:44:08.081299 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b24258d-310a-4504-829b-5bbd08f76070" containerID="c7f87629a9157e18499f43c456af981d6e0daa1d8476a390c3581b18b2a8b7b1" exitCode=0 Nov 24 12:44:08 crc kubenswrapper[4752]: I1124 12:44:08.081370 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerDied","Data":"c7f87629a9157e18499f43c456af981d6e0daa1d8476a390c3581b18b2a8b7b1"} Nov 24 12:44:08 crc kubenswrapper[4752]: I1124 12:44:08.873336 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:44:08 crc kubenswrapper[4752]: I1124 12:44:08.916486 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:44:09 crc kubenswrapper[4752]: I1124 12:44:09.095862 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b24258d-310a-4504-829b-5bbd08f76070" containerID="492a06f86f5da1dd0b6b870f2d156b322e9b0899112f2167b0546788778ffb08" exitCode=0 Nov 24 12:44:09 crc kubenswrapper[4752]: I1124 12:44:09.096967 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerDied","Data":"492a06f86f5da1dd0b6b870f2d156b322e9b0899112f2167b0546788778ffb08"} Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.556141 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.619625 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdh7p\" (UniqueName: \"kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p\") pod \"9b24258d-310a-4504-829b-5bbd08f76070\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.619800 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util\") pod \"9b24258d-310a-4504-829b-5bbd08f76070\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.619937 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle\") pod \"9b24258d-310a-4504-829b-5bbd08f76070\" (UID: \"9b24258d-310a-4504-829b-5bbd08f76070\") " Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.623258 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle" (OuterVolumeSpecName: "bundle") pod "9b24258d-310a-4504-829b-5bbd08f76070" (UID: "9b24258d-310a-4504-829b-5bbd08f76070"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.626927 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p" (OuterVolumeSpecName: "kube-api-access-tdh7p") pod "9b24258d-310a-4504-829b-5bbd08f76070" (UID: "9b24258d-310a-4504-829b-5bbd08f76070"). InnerVolumeSpecName "kube-api-access-tdh7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.633143 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util" (OuterVolumeSpecName: "util") pod "9b24258d-310a-4504-829b-5bbd08f76070" (UID: "9b24258d-310a-4504-829b-5bbd08f76070"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.723737 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdh7p\" (UniqueName: \"kubernetes.io/projected/9b24258d-310a-4504-829b-5bbd08f76070-kube-api-access-tdh7p\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.724501 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-util\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.724651 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b24258d-310a-4504-829b-5bbd08f76070-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.907702 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"] Nov 24 12:44:10 crc kubenswrapper[4752]: E1124 12:44:10.908667 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="util" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.908689 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="util" Nov 24 12:44:10 crc kubenswrapper[4752]: E1124 12:44:10.908727 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="pull" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.908738 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="pull" Nov 24 12:44:10 crc kubenswrapper[4752]: E1124 12:44:10.908814 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="extract" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.908828 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="extract" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.909153 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b24258d-310a-4504-829b-5bbd08f76070" containerName="extract" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.917374 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:10 crc kubenswrapper[4752]: I1124 12:44:10.927754 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"] Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.040975 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7gt\" (UniqueName: \"kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.041048 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.041128 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.058975 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5dfcff4df9-n9brt" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.059084 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.129004 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" event={"ID":"9b24258d-310a-4504-829b-5bbd08f76070","Type":"ContainerDied","Data":"5c372e5c913c0904d526fcb773957e25683641dfc749b6a11a2cad933a949f3a"} Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.129039 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c372e5c913c0904d526fcb773957e25683641dfc749b6a11a2cad933a949f3a" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.129068 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.143517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7gt\" (UniqueName: \"kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.143586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.143661 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.144238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.144339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.165323 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7gt\" (UniqueName: \"kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt\") pod \"redhat-marketplace-zs99p\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") " pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.257564 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.490609 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.491194 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d74gw" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" containerID="cri-o://01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0" gracePeriod=2 Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.729587 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"] Nov 24 12:44:11 crc kubenswrapper[4752]: I1124 12:44:11.730358 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:44:11 crc kubenswrapper[4752]: E1124 12:44:11.730567 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.131414 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.146539 4752 generic.go:334] "Generic (PLEG): container finished" podID="099586b8-337b-49b9-93ce-8799907d222e" containerID="01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0" exitCode=0 Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.146603 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerDied","Data":"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0"} Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.146631 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d74gw" event={"ID":"099586b8-337b-49b9-93ce-8799907d222e","Type":"ContainerDied","Data":"b2f0dd36dbb1bbccbc37e7fd6067a279a3bda4a8247ca7a0004231e93ae8b293"} Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.146647 4752 scope.go:117] "RemoveContainer" containerID="01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.146802 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d74gw" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.152138 4752 generic.go:334] "Generic (PLEG): container finished" podID="c81e6e6c-7a65-4932-9320-def9f487a951" containerID="b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608" exitCode=0 Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.153707 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerDied","Data":"b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608"} Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.153933 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerStarted","Data":"e2db77bfa18d31105e2466051817e4db1ba3c67caba5eab974dfbea32ba97a60"} Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.159387 4752 generic.go:334] "Generic (PLEG): container finished" podID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerID="0e2ca094df1700bef9c91bc549879094f0e8c20bb6b3de251d08d53391dde6c8" exitCode=137 Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.159424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerDied","Data":"0e2ca094df1700bef9c91bc549879094f0e8c20bb6b3de251d08d53391dde6c8"} Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.188385 4752 scope.go:117] "RemoveContainer" containerID="c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.215391 4752 scope.go:117] "RemoveContainer" containerID="0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.242929 4752 scope.go:117] "RemoveContainer" containerID="01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0" Nov 24 12:44:12 crc kubenswrapper[4752]: E1124 12:44:12.243500 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0\": container with ID starting with 01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0 not found: ID does not exist" containerID="01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.243531 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0"} err="failed to get container status \"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0\": rpc error: code = NotFound desc = could not find container \"01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0\": container with ID starting with 01659914f97e5ad6ddfc51e052bf7773863846e4701fabbe6c64a70399d586c0 not found: ID does not exist" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.243558 4752 scope.go:117] "RemoveContainer" containerID="c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af" Nov 24 12:44:12 crc kubenswrapper[4752]: E1124 12:44:12.243829 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af\": container with ID starting with c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af not found: ID does not exist" containerID="c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.243855 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af"} err="failed to get container status \"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af\": rpc error: code = NotFound desc = could not find container \"c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af\": container with ID starting with c4e4194934b183535285437ec179794294f6ab77a34a074e2870b9db5b7be5af not found: ID does not exist" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.243871 4752 scope.go:117] "RemoveContainer" containerID="0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e" Nov 24 12:44:12 crc kubenswrapper[4752]: E1124 12:44:12.244136 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e\": container with ID starting with 0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e not found: ID does not exist" containerID="0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.244153 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e"} err="failed to get container status \"0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e\": rpc error: code = NotFound desc = could not find container \"0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e\": container with ID starting with 0603e78e93b1a3bb1a5c40ea05318e9ba4825f70dd94dad27488af03db76797e not found: ID does not exist" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.285636 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtvnd\" (UniqueName: \"kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd\") pod \"099586b8-337b-49b9-93ce-8799907d222e\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.285822 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content\") pod \"099586b8-337b-49b9-93ce-8799907d222e\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.285891 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities\") pod \"099586b8-337b-49b9-93ce-8799907d222e\" (UID: \"099586b8-337b-49b9-93ce-8799907d222e\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.286387 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities" (OuterVolumeSpecName: "utilities") pod "099586b8-337b-49b9-93ce-8799907d222e" (UID: "099586b8-337b-49b9-93ce-8799907d222e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.287046 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.292514 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd" (OuterVolumeSpecName: "kube-api-access-qtvnd") pod "099586b8-337b-49b9-93ce-8799907d222e" (UID: "099586b8-337b-49b9-93ce-8799907d222e"). InnerVolumeSpecName "kube-api-access-qtvnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.305790 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.383793 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "099586b8-337b-49b9-93ce-8799907d222e" (UID: "099586b8-337b-49b9-93ce-8799907d222e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.388325 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key\") pod \"9442f57a-2b65-4f64-be4f-7c1a834acea9\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.388483 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzv2k\" (UniqueName: \"kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k\") pod \"9442f57a-2b65-4f64-be4f-7c1a834acea9\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.388502 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data\") pod \"9442f57a-2b65-4f64-be4f-7c1a834acea9\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.388563 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs\") pod \"9442f57a-2b65-4f64-be4f-7c1a834acea9\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.388666 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts\") pod \"9442f57a-2b65-4f64-be4f-7c1a834acea9\" (UID: \"9442f57a-2b65-4f64-be4f-7c1a834acea9\") " Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.389122 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/099586b8-337b-49b9-93ce-8799907d222e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.389139 4752 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qtvnd\" (UniqueName: \"kubernetes.io/projected/099586b8-337b-49b9-93ce-8799907d222e-kube-api-access-qtvnd\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.390585 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs" (OuterVolumeSpecName: "logs") pod "9442f57a-2b65-4f64-be4f-7c1a834acea9" (UID: "9442f57a-2b65-4f64-be4f-7c1a834acea9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.392948 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9442f57a-2b65-4f64-be4f-7c1a834acea9" (UID: "9442f57a-2b65-4f64-be4f-7c1a834acea9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.393489 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k" (OuterVolumeSpecName: "kube-api-access-nzv2k") pod "9442f57a-2b65-4f64-be4f-7c1a834acea9" (UID: "9442f57a-2b65-4f64-be4f-7c1a834acea9"). InnerVolumeSpecName "kube-api-access-nzv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.411674 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data" (OuterVolumeSpecName: "config-data") pod "9442f57a-2b65-4f64-be4f-7c1a834acea9" (UID: "9442f57a-2b65-4f64-be4f-7c1a834acea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.413379 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts" (OuterVolumeSpecName: "scripts") pod "9442f57a-2b65-4f64-be4f-7c1a834acea9" (UID: "9442f57a-2b65-4f64-be4f-7c1a834acea9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.491494 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.491525 4752 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9442f57a-2b65-4f64-be4f-7c1a834acea9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.491536 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzv2k\" (UniqueName: \"kubernetes.io/projected/9442f57a-2b65-4f64-be4f-7c1a834acea9-kube-api-access-nzv2k\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.491545 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9442f57a-2b65-4f64-be4f-7c1a834acea9-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.491553 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9442f57a-2b65-4f64-be4f-7c1a834acea9-logs\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.528482 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.536424 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d74gw"] Nov 24 12:44:12 crc kubenswrapper[4752]: I1124 12:44:12.741376 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099586b8-337b-49b9-93ce-8799907d222e" path="/var/lib/kubelet/pods/099586b8-337b-49b9-93ce-8799907d222e/volumes" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.171207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfcff4df9-n9brt" event={"ID":"9442f57a-2b65-4f64-be4f-7c1a834acea9","Type":"ContainerDied","Data":"2bb2e0b596e1faf56a723fb9a4f1aa98d3ed80e28a8b5c5eedccc7adbff6089f"} Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.171545 4752 scope.go:117] "RemoveContainer" containerID="328fa17f7b2e7d6ce7d0ff5a9fb127aef069ff0b8009cfb11c3e0f3441aa0b7c" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.171699 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dfcff4df9-n9brt" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.180584 4752 generic.go:334] "Generic (PLEG): container finished" podID="c81e6e6c-7a65-4932-9320-def9f487a951" containerID="ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a" exitCode=0 Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.180620 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerDied","Data":"ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a"} Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.196047 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.205701 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5dfcff4df9-n9brt"] Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299063 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"] Nov 24 12:44:13 crc kubenswrapper[4752]: E1124 12:44:13.299626 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="extract-content" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299652 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="extract-content" Nov 24 12:44:13 crc kubenswrapper[4752]: E1124 12:44:13.299681 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299689 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" Nov 24 12:44:13 crc kubenswrapper[4752]: E1124 12:44:13.299715 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon-log" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299723 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon-log" Nov 24 12:44:13 crc kubenswrapper[4752]: E1124 12:44:13.299736 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299761 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" Nov 24 12:44:13 crc kubenswrapper[4752]: E1124 12:44:13.299782 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="extract-utilities" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299788 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="extract-utilities" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299972 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="099586b8-337b-49b9-93ce-8799907d222e" containerName="registry-server" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.299984 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" containerName="horizon" Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.301424 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.324628 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"]
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.412668 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.412764 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.412786 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fgrn\" (UniqueName: \"kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.414336 4752 scope.go:117] "RemoveContainer" containerID="0e2ca094df1700bef9c91bc549879094f0e8c20bb6b3de251d08d53391dde6c8"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.515758 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.515868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.515898 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgrn\" (UniqueName: \"kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.516360 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.516389 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.545038 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgrn\" (UniqueName: \"kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn\") pod \"certified-operators-6rcxw\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:13 crc kubenswrapper[4752]: I1124 12:44:13.628562 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:14 crc kubenswrapper[4752]: I1124 12:44:14.192117 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerStarted","Data":"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1"}
Nov 24 12:44:14 crc kubenswrapper[4752]: I1124 12:44:14.219265 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zs99p" podStartSLOduration=2.763372949 podStartE2EDuration="4.219249087s" podCreationTimestamp="2025-11-24 12:44:10 +0000 UTC" firstStartedPulling="2025-11-24 12:44:12.155623968 +0000 UTC m=+5858.140444257" lastFinishedPulling="2025-11-24 12:44:13.611500106 +0000 UTC m=+5859.596320395" observedRunningTime="2025-11-24 12:44:14.21515037 +0000 UTC m=+5860.199970659" watchObservedRunningTime="2025-11-24 12:44:14.219249087 +0000 UTC m=+5860.204069376"
Nov 24 12:44:14 crc kubenswrapper[4752]: I1124 12:44:14.243493 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"]
Nov 24 12:44:14 crc kubenswrapper[4752]: W1124 12:44:14.257438 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode906b79b_99ac_4575_9f17_f8a4ba8114a8.slice/crio-1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581 WatchSource:0}: Error finding container 1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581: Status 404 returned error can't find the container with id 1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581
Nov 24 12:44:14 crc kubenswrapper[4752]: I1124 12:44:14.755135 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9442f57a-2b65-4f64-be4f-7c1a834acea9" path="/var/lib/kubelet/pods/9442f57a-2b65-4f64-be4f-7c1a834acea9/volumes"
Nov 24 12:44:15 crc kubenswrapper[4752]: I1124 12:44:15.203266 4752 generic.go:334] "Generic (PLEG): container finished" podID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerID="4d679221482ab2b872d33e1bcb1920a670c7d5c6ce790424bdd0a64f9728c993" exitCode=0
Nov 24 12:44:15 crc kubenswrapper[4752]: I1124 12:44:15.203384 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerDied","Data":"4d679221482ab2b872d33e1bcb1920a670c7d5c6ce790424bdd0a64f9728c993"}
Nov 24 12:44:15 crc kubenswrapper[4752]: I1124 12:44:15.203618 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerStarted","Data":"1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581"}
for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerStarted","Data":"1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581"} Nov 24 12:44:16 crc kubenswrapper[4752]: I1124 12:44:16.212973 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerStarted","Data":"7ce472adde3fdc6fd4dcf478ccf01c572ab8edb434b0cd9dbe9ce464206dd2da"} Nov 24 12:44:17 crc kubenswrapper[4752]: I1124 12:44:17.225095 4752 generic.go:334] "Generic (PLEG): container finished" podID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerID="7ce472adde3fdc6fd4dcf478ccf01c572ab8edb434b0cd9dbe9ce464206dd2da" exitCode=0 Nov 24 12:44:17 crc kubenswrapper[4752]: I1124 12:44:17.225213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerDied","Data":"7ce472adde3fdc6fd4dcf478ccf01c572ab8edb434b0cd9dbe9ce464206dd2da"} Nov 24 12:44:18 crc kubenswrapper[4752]: I1124 12:44:18.243706 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerStarted","Data":"e996a90f536c7e90e59df55a7e68ecf0050c22d11493b2bd987e875364ff6cb2"} Nov 24 12:44:18 crc kubenswrapper[4752]: I1124 12:44:18.292520 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rcxw" podStartSLOduration=2.746276448 podStartE2EDuration="5.292502246s" podCreationTimestamp="2025-11-24 12:44:13 +0000 UTC" firstStartedPulling="2025-11-24 12:44:15.205387966 +0000 UTC m=+5861.190208255" lastFinishedPulling="2025-11-24 12:44:17.751613764 +0000 UTC m=+5863.736434053" observedRunningTime="2025-11-24 12:44:18.288844701 +0000 UTC m=+5864.273664990" watchObservedRunningTime="2025-11-24 12:44:18.292502246 +0000 UTC m=+5864.277322535" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.355980 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.358383 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.366401 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.367921 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.368125 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-2sgfl" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.379429 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.499666 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll674\" (UniqueName: \"kubernetes.io/projected/58f8c9ac-5f6e-4c39-a911-444d6ccf0391-kube-api-access-ll674\") pod \"obo-prometheus-operator-668cf9dfbb-xwxrt\" (UID: \"58f8c9ac-5f6e-4c39-a911-444d6ccf0391\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.503914 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.505358 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.510864 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.511260 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-g7j58" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.515575 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.518983 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.531566 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"]
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.586463 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"]
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.602106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.602211 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.602258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.603059 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.603163 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll674\" (UniqueName: \"kubernetes.io/projected/58f8c9ac-5f6e-4c39-a911-444d6ccf0391-kube-api-access-ll674\") pod \"obo-prometheus-operator-668cf9dfbb-xwxrt\" (UID: \"58f8c9ac-5f6e-4c39-a911-444d6ccf0391\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.644784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll674\" (UniqueName: \"kubernetes.io/projected/58f8c9ac-5f6e-4c39-a911-444d6ccf0391-kube-api-access-ll674\") pod \"obo-prometheus-operator-668cf9dfbb-xwxrt\" (UID: \"58f8c9ac-5f6e-4c39-a911-444d6ccf0391\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.689411 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.704559 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.704699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.704763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.704807 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.711065 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.714385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.717289 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57f059a9-5d15-4efb-a1d0-392993e2ae4f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6\" (UID: \"57f059a9-5d15-4efb-a1d0-392993e2ae4f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"
Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.726317 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eca23da-684b-433d-a748-adc988ebd2a0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"
\"obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb\" (UID: \"8eca23da-684b-433d-a748-adc988ebd2a0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.808419 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tqxdw"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.809805 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tqxdw"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.809904 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.815386 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-tq5bx" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.816032 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.825353 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.848597 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.911082 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1c72bff-b2ec-4286-8016-d05fe2ea859e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.911664 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmdvh\" (UniqueName: \"kubernetes.io/projected/e1c72bff-b2ec-4286-8016-d05fe2ea859e-kube-api-access-jmdvh\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.988014 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-grjrp"] Nov 24 12:44:20 crc kubenswrapper[4752]: I1124 12:44:20.989698 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-grjrp" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.000072 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-t9qps" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.001429 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-grjrp"] Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.019556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmdvh\" (UniqueName: \"kubernetes.io/projected/e1c72bff-b2ec-4286-8016-d05fe2ea859e-kube-api-access-jmdvh\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.019707 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1c72bff-b2ec-4286-8016-d05fe2ea859e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.031559 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1c72bff-b2ec-4286-8016-d05fe2ea859e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.100484 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmdvh\" (UniqueName: \"kubernetes.io/projected/e1c72bff-b2ec-4286-8016-d05fe2ea859e-kube-api-access-jmdvh\") pod \"observability-operator-d8bb48f5d-tqxdw\" (UID: \"e1c72bff-b2ec-4286-8016-d05fe2ea859e\") " pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.122713 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2d7\" (UniqueName: \"kubernetes.io/projected/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-kube-api-access-np2d7\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.122913 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-openshift-service-ca\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp" Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.220499 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.224465 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-openshift-service-ca\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.224534 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2d7\" (UniqueName: \"kubernetes.io/projected/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-kube-api-access-np2d7\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.225918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-openshift-service-ca\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.258768 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2d7\" (UniqueName: \"kubernetes.io/projected/cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb-kube-api-access-np2d7\") pod \"perses-operator-5446b9c989-grjrp\" (UID: \"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb\") " pod="openshift-operators/perses-operator-5446b9c989-grjrp"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.258864 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zs99p"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.258898 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zs99p"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.275683 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-grjrp"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.340457 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zs99p"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.433299 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zs99p"
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.634144 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt"]
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.822556 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb"]
Nov 24 12:44:21 crc kubenswrapper[4752]: W1124 12:44:21.824841 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eca23da_684b_433d_a748_adc988ebd2a0.slice/crio-beb9cb8fb8f48350cf372f96efe5c8dd6b4f52462ad5655eb6d9a45c1205e193 WatchSource:0}: Error finding container beb9cb8fb8f48350cf372f96efe5c8dd6b4f52462ad5655eb6d9a45c1205e193: Status 404 returned error can't find the container with id beb9cb8fb8f48350cf372f96efe5c8dd6b4f52462ad5655eb6d9a45c1205e193
Nov 24 12:44:21 crc kubenswrapper[4752]: I1124 12:44:21.967331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6"]
Nov 24 12:44:21 crc kubenswrapper[4752]: W1124 12:44:21.978421 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f059a9_5d15_4efb_a1d0_392993e2ae4f.slice/crio-64fefd2675423b9598e42aa48b488820ed59276651ab6d7f6639b51194290a19 WatchSource:0}: Error finding container 64fefd2675423b9598e42aa48b488820ed59276651ab6d7f6639b51194290a19: Status 404 returned error can't find the container with id 64fefd2675423b9598e42aa48b488820ed59276651ab6d7f6639b51194290a19
Nov 24 12:44:22 crc kubenswrapper[4752]: W1124 12:44:22.070695 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c72bff_b2ec_4286_8016_d05fe2ea859e.slice/crio-2661604b0fdc43795f26ce83b009f36f74239dbfed448a03237d2a7fa014e711 WatchSource:0}: Error finding container 2661604b0fdc43795f26ce83b009f36f74239dbfed448a03237d2a7fa014e711: Status 404 returned error can't find the container with id 2661604b0fdc43795f26ce83b009f36f74239dbfed448a03237d2a7fa014e711
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.071454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tqxdw"]
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.085258 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-grjrp"]
Nov 24 12:44:22 crc kubenswrapper[4752]: W1124 12:44:22.100513 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf04b43_41d6_4e39_9aa9_38bc4aa8a2bb.slice/crio-8d0184f8b2a4a1657a82a895ea94cfd93b6ef15474381b2ab3d53f172ad677ff WatchSource:0}: Error finding container 8d0184f8b2a4a1657a82a895ea94cfd93b6ef15474381b2ab3d53f172ad677ff: Status 404 returned error can't find the container with id 8d0184f8b2a4a1657a82a895ea94cfd93b6ef15474381b2ab3d53f172ad677ff
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.292313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6" event={"ID":"57f059a9-5d15-4efb-a1d0-392993e2ae4f","Type":"ContainerStarted","Data":"64fefd2675423b9598e42aa48b488820ed59276651ab6d7f6639b51194290a19"}
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.294328 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt" event={"ID":"58f8c9ac-5f6e-4c39-a911-444d6ccf0391","Type":"ContainerStarted","Data":"9eab510503ae8b52511eceaeee5e28cd879824880fe872984e8c128c4a877f97"}
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.303453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" event={"ID":"e1c72bff-b2ec-4286-8016-d05fe2ea859e","Type":"ContainerStarted","Data":"2661604b0fdc43795f26ce83b009f36f74239dbfed448a03237d2a7fa014e711"}
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.305982 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb" event={"ID":"8eca23da-684b-433d-a748-adc988ebd2a0","Type":"ContainerStarted","Data":"beb9cb8fb8f48350cf372f96efe5c8dd6b4f52462ad5655eb6d9a45c1205e193"}
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.307820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-grjrp" event={"ID":"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb","Type":"ContainerStarted","Data":"8d0184f8b2a4a1657a82a895ea94cfd93b6ef15474381b2ab3d53f172ad677ff"}
Nov 24 12:44:22 crc kubenswrapper[4752]: I1124 12:44:22.484885 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"]
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.325196 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zs99p" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="registry-server" containerID="cri-o://1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1" gracePeriod=2
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.629889 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.630164 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.713209 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rcxw"
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.734350 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:44:23 crc kubenswrapper[4752]: E1124 12:44:23.734598 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:44:23 crc kubenswrapper[4752]: I1124 12:44:23.999919 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs99p"
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.149023 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs7gt\" (UniqueName: \"kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt\") pod \"c81e6e6c-7a65-4932-9320-def9f487a951\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") "
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.149087 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content\") pod \"c81e6e6c-7a65-4932-9320-def9f487a951\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") "
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.149180 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities\") pod \"c81e6e6c-7a65-4932-9320-def9f487a951\" (UID: \"c81e6e6c-7a65-4932-9320-def9f487a951\") "
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.150378 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities" (OuterVolumeSpecName: "utilities") pod "c81e6e6c-7a65-4932-9320-def9f487a951" (UID: "c81e6e6c-7a65-4932-9320-def9f487a951"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.167428 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c81e6e6c-7a65-4932-9320-def9f487a951" (UID: "c81e6e6c-7a65-4932-9320-def9f487a951"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.174943 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt" (OuterVolumeSpecName: "kube-api-access-hs7gt") pod "c81e6e6c-7a65-4932-9320-def9f487a951" (UID: "c81e6e6c-7a65-4932-9320-def9f487a951"). InnerVolumeSpecName "kube-api-access-hs7gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.251980 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs7gt\" (UniqueName: \"kubernetes.io/projected/c81e6e6c-7a65-4932-9320-def9f487a951-kube-api-access-hs7gt\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.252018 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.252028 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81e6e6c-7a65-4932-9320-def9f487a951-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.341036 4752 generic.go:334] "Generic (PLEG): container finished" podID="c81e6e6c-7a65-4932-9320-def9f487a951" containerID="1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1" exitCode=0 Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.341075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerDied","Data":"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1"} Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.341438 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs99p" event={"ID":"c81e6e6c-7a65-4932-9320-def9f487a951","Type":"ContainerDied","Data":"e2db77bfa18d31105e2466051817e4db1ba3c67caba5eab974dfbea32ba97a60"} Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.341171 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs99p" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.341513 4752 scope.go:117] "RemoveContainer" containerID="1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.382270 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"] Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.391009 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs99p"] Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.418854 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rcxw" Nov 24 12:44:24 crc kubenswrapper[4752]: I1124 12:44:24.752939 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" path="/var/lib/kubelet/pods/c81e6e6c-7a65-4932-9320-def9f487a951/volumes" Nov 24 12:44:26 crc kubenswrapper[4752]: I1124 12:44:26.048510 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xrgxg"] Nov 24 12:44:26 crc kubenswrapper[4752]: I1124 12:44:26.057872 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xrgxg"] Nov 24 12:44:26 crc kubenswrapper[4752]: I1124 12:44:26.741768 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892c65e3-cd5b-4c52-8b65-35205b02e26b" path="/var/lib/kubelet/pods/892c65e3-cd5b-4c52-8b65-35205b02e26b/volumes" Nov 24 12:44:26 crc kubenswrapper[4752]: I1124 12:44:26.963267 4752 scope.go:117] "RemoveContainer" containerID="ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a" Nov 24 12:44:27 crc kubenswrapper[4752]: I1124 12:44:27.045477 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7355-account-create-2vzv7"] Nov 24 12:44:27 crc kubenswrapper[4752]: I1124 12:44:27.062403 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7355-account-create-2vzv7"] Nov 24 12:44:27 crc kubenswrapper[4752]: I1124 12:44:27.284085 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"] Nov 24 12:44:27 crc kubenswrapper[4752]: I1124 12:44:27.284347 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rcxw" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="registry-server" containerID="cri-o://e996a90f536c7e90e59df55a7e68ecf0050c22d11493b2bd987e875364ff6cb2" gracePeriod=2 Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.434692 4752 generic.go:334] "Generic (PLEG): container finished" podID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerID="e996a90f536c7e90e59df55a7e68ecf0050c22d11493b2bd987e875364ff6cb2" exitCode=0 Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.434765 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerDied","Data":"e996a90f536c7e90e59df55a7e68ecf0050c22d11493b2bd987e875364ff6cb2"} Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.440864 4752 scope.go:117] "RemoveContainer" containerID="b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.576987 4752 scope.go:117] "RemoveContainer" 
containerID="1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1" Nov 24 12:44:28 crc kubenswrapper[4752]: E1124 12:44:28.579583 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1\": container with ID starting with 1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1 not found: ID does not exist" containerID="1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.579620 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1"} err="failed to get container status \"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1\": rpc error: code = NotFound desc = could not find container \"1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1\": container with ID starting with 1c412324eda064b5145411ff2991fb2616368b7c55a70e21743c06161769acd1 not found: ID does not exist" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.579646 4752 scope.go:117] "RemoveContainer" containerID="ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a" Nov 24 12:44:28 crc kubenswrapper[4752]: E1124 12:44:28.581023 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a\": container with ID starting with ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a not found: ID does not exist" containerID="ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.581080 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a"} err="failed to get container status \"ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a\": rpc error: code = NotFound desc = could not find container \"ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a\": container with ID starting with ebbb13f03df2e23bed8f168202ceb48392707e2ab171e270d435732c92e8472a not found: ID does not exist" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.581456 4752 scope.go:117] "RemoveContainer" containerID="b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608" Nov 24 12:44:28 crc kubenswrapper[4752]: E1124 12:44:28.582471 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608\": container with ID starting with b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608 not found: ID does not exist" containerID="b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.582504 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608"} err="failed to get container status \"b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608\": rpc error: code = NotFound desc = could not find container \"b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608\": container with ID starting with 
b0239d9f10e4a4a0527f7301052d2dcf1f4b6b9b22d659379d5fca1515669608 not found: ID does not exist" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.774534 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98072d93-5346-4958-9c03-136c784abc74" path="/var/lib/kubelet/pods/98072d93-5346-4958-9c03-136c784abc74/volumes" Nov 24 12:44:28 crc kubenswrapper[4752]: I1124 12:44:28.927487 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rcxw" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.060573 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fgrn\" (UniqueName: \"kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn\") pod \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.061157 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content\") pod \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.061245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities\") pod \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\" (UID: \"e906b79b-99ac-4575-9f17-f8a4ba8114a8\") " Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.062615 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities" (OuterVolumeSpecName: "utilities") pod "e906b79b-99ac-4575-9f17-f8a4ba8114a8" (UID: "e906b79b-99ac-4575-9f17-f8a4ba8114a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.087993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn" (OuterVolumeSpecName: "kube-api-access-8fgrn") pod "e906b79b-99ac-4575-9f17-f8a4ba8114a8" (UID: "e906b79b-99ac-4575-9f17-f8a4ba8114a8"). InnerVolumeSpecName "kube-api-access-8fgrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.115702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e906b79b-99ac-4575-9f17-f8a4ba8114a8" (UID: "e906b79b-99ac-4575-9f17-f8a4ba8114a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.163834 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fgrn\" (UniqueName: \"kubernetes.io/projected/e906b79b-99ac-4575-9f17-f8a4ba8114a8-kube-api-access-8fgrn\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.163862 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.163873 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e906b79b-99ac-4575-9f17-f8a4ba8114a8-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.478916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6" event={"ID":"57f059a9-5d15-4efb-a1d0-392993e2ae4f","Type":"ContainerStarted","Data":"100505307e2d11700590d924d291c902c6d64549c274e2384675d9496424f601"} Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.516321 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6" podStartSLOduration=2.924067699 podStartE2EDuration="9.516303244s" podCreationTimestamp="2025-11-24 12:44:20 +0000 UTC" firstStartedPulling="2025-11-24 12:44:21.993441184 +0000 UTC m=+5867.978261473" lastFinishedPulling="2025-11-24 12:44:28.585676729 +0000 UTC m=+5874.570497018" observedRunningTime="2025-11-24 12:44:29.515213292 +0000 UTC m=+5875.500033581" watchObservedRunningTime="2025-11-24 12:44:29.516303244 +0000 UTC m=+5875.501123533" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.548843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb" event={"ID":"8eca23da-684b-433d-a748-adc988ebd2a0","Type":"ContainerStarted","Data":"fb1794909729141319d7862c56059d64770a03629c9eb3a094df081ccaa50267"} Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.551817 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rcxw" event={"ID":"e906b79b-99ac-4575-9f17-f8a4ba8114a8","Type":"ContainerDied","Data":"1049cf4ef45b6cde7ee55900935162d9eec97462d31210a7a943525cc7f55581"} Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.551862 4752 scope.go:117] "RemoveContainer" containerID="e996a90f536c7e90e59df55a7e68ecf0050c22d11493b2bd987e875364ff6cb2" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.552136 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rcxw" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.572735 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-grjrp" event={"ID":"cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb","Type":"ContainerStarted","Data":"55c096f159eedbddb229ac8fff80c295173ffd08cfe1567c6a2cf03ef54677c0"} Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.573561 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-grjrp" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.637563 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt" podStartSLOduration=2.6692178 podStartE2EDuration="9.637528565s" podCreationTimestamp="2025-11-24 12:44:20 +0000 UTC" firstStartedPulling="2025-11-24 12:44:21.657942059 +0000 UTC m=+5867.642762348" lastFinishedPulling="2025-11-24 12:44:28.626252824 +0000 UTC m=+5874.611073113" observedRunningTime="2025-11-24 12:44:29.61507958 +0000 UTC m=+5875.599899869" watchObservedRunningTime="2025-11-24 12:44:29.637528565 +0000 UTC m=+5875.622348854" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.656425 4752 scope.go:117] "RemoveContainer" containerID="7ce472adde3fdc6fd4dcf478ccf01c572ab8edb434b0cd9dbe9ce464206dd2da" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.676469 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-grjrp" podStartSLOduration=3.201976069 podStartE2EDuration="9.676447122s" podCreationTimestamp="2025-11-24 12:44:20 +0000 UTC" firstStartedPulling="2025-11-24 12:44:22.102462715 +0000 UTC m=+5868.087283004" lastFinishedPulling="2025-11-24 12:44:28.576933768 +0000 UTC m=+5874.561754057" observedRunningTime="2025-11-24 12:44:29.661326798 +0000 UTC m=+5875.646147087" watchObservedRunningTime="2025-11-24 12:44:29.676447122 +0000 UTC m=+5875.661267411" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.712335 4752 scope.go:117] "RemoveContainer" containerID="4d679221482ab2b872d33e1bcb1920a670c7d5c6ce790424bdd0a64f9728c993" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.781004 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"] Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.813779 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb" podStartSLOduration=3.203006039 podStartE2EDuration="9.813736495s" podCreationTimestamp="2025-11-24 12:44:20 +0000 UTC" firstStartedPulling="2025-11-24 12:44:21.827347544 +0000 UTC m=+5867.812167833" lastFinishedPulling="2025-11-24 12:44:28.43807799 +0000 UTC m=+5874.422898289" observedRunningTime="2025-11-24 12:44:29.773598452 +0000 UTC m=+5875.758418751" watchObservedRunningTime="2025-11-24 12:44:29.813736495 +0000 UTC m=+5875.798556784" Nov 24 12:44:29 crc kubenswrapper[4752]: I1124 12:44:29.816716 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rcxw"] Nov 24 12:44:30 crc kubenswrapper[4752]: I1124 12:44:30.585392 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-xwxrt" 
event={"ID":"58f8c9ac-5f6e-4c39-a911-444d6ccf0391","Type":"ContainerStarted","Data":"e972cdd3594c3c02edf529a316fc143f3883ebaf2bb43cdd7f475470e10b21cb"} Nov 24 12:44:30 crc kubenswrapper[4752]: I1124 12:44:30.739484 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" path="/var/lib/kubelet/pods/e906b79b-99ac-4575-9f17-f8a4ba8114a8/volumes" Nov 24 12:44:33 crc kubenswrapper[4752]: I1124 12:44:33.625405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" event={"ID":"e1c72bff-b2ec-4286-8016-d05fe2ea859e","Type":"ContainerStarted","Data":"803fb91abaa27aeaa3050e9f0b434ab940c0a9bfc0f88c7f1f545897bfc8cfef"} Nov 24 12:44:33 crc kubenswrapper[4752]: I1124 12:44:33.625731 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:33 crc kubenswrapper[4752]: I1124 12:44:33.628522 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" Nov 24 12:44:33 crc kubenswrapper[4752]: I1124 12:44:33.647318 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-tqxdw" podStartSLOduration=3.22601944 podStartE2EDuration="13.647296051s" podCreationTimestamp="2025-11-24 12:44:20 +0000 UTC" firstStartedPulling="2025-11-24 12:44:22.079559027 +0000 UTC m=+5868.064379326" lastFinishedPulling="2025-11-24 12:44:32.500835658 +0000 UTC m=+5878.485655937" observedRunningTime="2025-11-24 12:44:33.644406528 +0000 UTC m=+5879.629226817" watchObservedRunningTime="2025-11-24 12:44:33.647296051 +0000 UTC m=+5879.632116340" Nov 24 12:44:34 crc kubenswrapper[4752]: I1124 12:44:34.034775 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-kqdcp"] Nov 24 12:44:34 crc kubenswrapper[4752]: I1124 12:44:34.046695 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-kqdcp"] Nov 24 12:44:34 crc kubenswrapper[4752]: I1124 12:44:34.741820 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f88ca20-3284-434f-a7a6-13e5d3328574" path="/var/lib/kubelet/pods/1f88ca20-3284-434f-a7a6-13e5d3328574/volumes" Nov 24 12:44:36 crc kubenswrapper[4752]: I1124 12:44:36.728223 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:44:36 crc kubenswrapper[4752]: E1124 12:44:36.729282 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:44:41 crc kubenswrapper[4752]: I1124 12:44:41.278672 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-grjrp" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.137385 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.137659 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" 
podUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" containerName="openstackclient" containerID="cri-o://86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20" gracePeriod=2 Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.150808 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.195277 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200389 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="extract-content" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200416 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="extract-content" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200433 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="extract-utilities" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200439 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="extract-utilities" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200452 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="extract-utilities" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200460 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="extract-utilities" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200471 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200476 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200492 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200499 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200521 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="extract-content" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200526 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="extract-content" Nov 24 12:44:43 crc kubenswrapper[4752]: E1124 12:44:43.200542 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" containerName="openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200548 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" containerName="openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200731 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81e6e6c-7a65-4932-9320-def9f487a951" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200753 4752 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e906b79b-99ac-4575-9f17-f8a4ba8114a8" containerName="registry-server" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.200771 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" containerName="openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.201492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.217645 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" podUID="13a1dca4-d743-45d2-b4a8-262db404b0b4" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.302333 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.378051 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.378102 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.378203 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvbc\" (UniqueName: \"kubernetes.io/projected/13a1dca4-d743-45d2-b4a8-262db404b0b4-kube-api-access-fzvbc\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.472568 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.480456 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.482671 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.482722 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mq92\" (UniqueName: \"kubernetes.io/projected/f528da89-b1d3-4b08-a8af-b8371b35ff7c-kube-api-access-7mq92\") pod \"kube-state-metrics-0\" (UID: \"f528da89-b1d3-4b08-a8af-b8371b35ff7c\") " pod="openstack/kube-state-metrics-0" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.482797 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvbc\" (UniqueName: \"kubernetes.io/projected/13a1dca4-d743-45d2-b4a8-262db404b0b4-kube-api-access-fzvbc\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.482895 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.490185 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4nr6c" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.491180 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.491728 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/13a1dca4-d743-45d2-b4a8-262db404b0b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.499956 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.590608 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvbc\" (UniqueName: \"kubernetes.io/projected/13a1dca4-d743-45d2-b4a8-262db404b0b4-kube-api-access-fzvbc\") pod \"openstackclient\" (UID: \"13a1dca4-d743-45d2-b4a8-262db404b0b4\") " pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.591249 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mq92\" (UniqueName: \"kubernetes.io/projected/f528da89-b1d3-4b08-a8af-b8371b35ff7c-kube-api-access-7mq92\") pod \"kube-state-metrics-0\" (UID: \"f528da89-b1d3-4b08-a8af-b8371b35ff7c\") " pod="openstack/kube-state-metrics-0" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.626153 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7mq92\" (UniqueName: \"kubernetes.io/projected/f528da89-b1d3-4b08-a8af-b8371b35ff7c-kube-api-access-7mq92\") pod \"kube-state-metrics-0\" (UID: \"f528da89-b1d3-4b08-a8af-b8371b35ff7c\") " pod="openstack/kube-state-metrics-0" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.846258 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 12:44:43 crc kubenswrapper[4752]: I1124 12:44:43.878056 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.061737 4752 scope.go:117] "RemoveContainer" containerID="ad31de39ea38d7148a7a06c921db1ce2e84f70de2691a73cae88ad44790fb196" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.128078 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.130400 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.132511 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-cmtz6" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.132716 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.132873 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.133429 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.135931 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.143562 4752 scope.go:117] "RemoveContainer" containerID="b3f0df645f5df99584d6875624dede4a507f863fa3d16229f78dd8e5eda2060e" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.173093 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgb8w\" (UniqueName: \"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-kube-api-access-zgb8w\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217482 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217503 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217553 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217586 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.217633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.319827 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.319906 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.319951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.320008 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.320052 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.320076 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgb8w\" (UniqueName: \"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-kube-api-access-zgb8w\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.320127 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.327630 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.345272 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34761b79-c432-4e7e-9715-131bb3bb4450-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.345457 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.345980 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.351378 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.351729 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/34761b79-c432-4e7e-9715-131bb3bb4450-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.359144 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgb8w\" (UniqueName: 
\"kubernetes.io/projected/34761b79-c432-4e7e-9715-131bb3bb4450-kube-api-access-zgb8w\") pod \"alertmanager-metric-storage-0\" (UID: \"34761b79-c432-4e7e-9715-131bb3bb4450\") " pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.462851 4752 scope.go:117] "RemoveContainer" containerID="d087871730aedd97b867333e38a0fd463864679121f1418c73b64a4f44d11f11" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.464402 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.584997 4752 scope.go:117] "RemoveContainer" containerID="e2113b1afe5b70e17f808ec03a9fb280c36d905d6734ed9b35522d5b29dcf520" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.709439 4752 scope.go:117] "RemoveContainer" containerID="9cf4a0e6c9f7c858c142856c13c36e4eaa4e581765bc099e8a893deceb3cbb1b" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.791341 4752 scope.go:117] "RemoveContainer" containerID="008c4ec2a5cb9debd960bd9495d9f670997fe14c99c46158134c2cab1e971864" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.792883 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.795076 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.795169 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.805920 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.818421 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.826813 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-r6dlr" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.827131 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.827290 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.828016 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.855696 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936186 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50616cf6-5526-40a2-bd9a-6b9608421058-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936234 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwrx\" (UniqueName: 
\"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-kube-api-access-xpwrx\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936302 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936363 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936433 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50616cf6-5526-40a2-bd9a-6b9608421058-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936454 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936476 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:44 crc kubenswrapper[4752]: I1124 12:44:44.936495 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.039969 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040038 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 
12:44:45.040111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50616cf6-5526-40a2-bd9a-6b9608421058-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040148 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040170 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040228 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50616cf6-5526-40a2-bd9a-6b9608421058-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.040247 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpwrx\" (UniqueName: \"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-kube-api-access-xpwrx\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.042455 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/50616cf6-5526-40a2-bd9a-6b9608421058-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.052555 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.052991 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " 
pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.053961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/50616cf6-5526-40a2-bd9a-6b9608421058-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.055179 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.055229 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/93615f7981d48bbc02820b487b9d99ad5700f99bc808c45cf02a6d5ccd50ad12/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.057996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.065319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/50616cf6-5526-40a2-bd9a-6b9608421058-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.071335 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.107274 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwrx\" (UniqueName: \"kubernetes.io/projected/50616cf6-5526-40a2-bd9a-6b9608421058-kube-api-access-xpwrx\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.365079 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.586384 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d55e3ba6-f9c3-4c11-9018-78db1c1780b7\") pod \"prometheus-metric-storage-0\" (UID: \"50616cf6-5526-40a2-bd9a-6b9608421058\") " pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.729014 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.860479 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.877105 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" podUID="13a1dca4-d743-45d2-b4a8-262db404b0b4" Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.980179 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"34761b79-c432-4e7e-9715-131bb3bb4450","Type":"ContainerStarted","Data":"ab7f522955a8a6436a92c5644a1b97de840e8005289b84b889639d0e24a3099e"} Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.987416 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f528da89-b1d3-4b08-a8af-b8371b35ff7c","Type":"ContainerStarted","Data":"959f6b1dfb8c3897d20a515b47bb76db3c3d2cc0a8448abb583e9efe81ff2236"} Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.999445 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"13a1dca4-d743-45d2-b4a8-262db404b0b4","Type":"ContainerStarted","Data":"881d9f8810d3269bdc515177ca9cd2cf02760f3b7c48ff26f1535c10064c51a3"} Nov 24 12:44:45 crc kubenswrapper[4752]: I1124 12:44:45.999488 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"13a1dca4-d743-45d2-b4a8-262db404b0b4","Type":"ContainerStarted","Data":"a894b95a2aa22f71ea17e74698a05ae9e73678592e58742dcdab58cd06b5b4e7"} Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.006882 4752 generic.go:334] "Generic (PLEG): container finished" podID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" containerID="86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20" exitCode=137 Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.006954 4752 scope.go:117] "RemoveContainer" containerID="86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.007111 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.018995 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4rj\" (UniqueName: \"kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj\") pod \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.019338 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret\") pod \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.019373 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config\") pod \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\" (UID: \"aec6027c-fa97-4f99-adb4-8b4e298a7aa4\") " Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.025279 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" podUID="13a1dca4-d743-45d2-b4a8-262db404b0b4" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.035044 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj" (OuterVolumeSpecName: "kube-api-access-xv4rj") pod "aec6027c-fa97-4f99-adb4-8b4e298a7aa4" (UID: "aec6027c-fa97-4f99-adb4-8b4e298a7aa4"). InnerVolumeSpecName "kube-api-access-xv4rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.044611 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.044590336 podStartE2EDuration="3.044590336s" podCreationTimestamp="2025-11-24 12:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:44:46.017889269 +0000 UTC m=+5892.002709558" watchObservedRunningTime="2025-11-24 12:44:46.044590336 +0000 UTC m=+5892.029410625" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.073458 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aec6027c-fa97-4f99-adb4-8b4e298a7aa4" (UID: "aec6027c-fa97-4f99-adb4-8b4e298a7aa4"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.101154 4752 scope.go:117] "RemoveContainer" containerID="86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20" Nov 24 12:44:46 crc kubenswrapper[4752]: E1124 12:44:46.101526 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20\": container with ID starting with 86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20 not found: ID does not exist" containerID="86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.101573 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20"} err="failed to get container status \"86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20\": rpc error: code = NotFound desc = could not find container \"86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20\": container with ID starting with 86326bb1ea503c1e9a7da0299a97728771625a5237aa8cf4d82339d6a4fb0c20 not found: ID does not exist" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.122603 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.122648 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4rj\" (UniqueName: \"kubernetes.io/projected/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-kube-api-access-xv4rj\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.130617 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aec6027c-fa97-4f99-adb4-8b4e298a7aa4" (UID: "aec6027c-fa97-4f99-adb4-8b4e298a7aa4"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.225763 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aec6027c-fa97-4f99-adb4-8b4e298a7aa4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.322909 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" podUID="13a1dca4-d743-45d2-b4a8-262db404b0b4" Nov 24 12:44:46 crc kubenswrapper[4752]: W1124 12:44:46.556186 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50616cf6_5526_40a2_bd9a_6b9608421058.slice/crio-2af224c3a97c1a6731e0adaebb0c448e38b60d082eeb87fda93733f3452fe78f WatchSource:0}: Error finding container 2af224c3a97c1a6731e0adaebb0c448e38b60d082eeb87fda93733f3452fe78f: Status 404 returned error can't find the container with id 2af224c3a97c1a6731e0adaebb0c448e38b60d082eeb87fda93733f3452fe78f Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.563699 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 24 12:44:46 crc kubenswrapper[4752]: I1124 12:44:46.745341 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec6027c-fa97-4f99-adb4-8b4e298a7aa4" path="/var/lib/kubelet/pods/aec6027c-fa97-4f99-adb4-8b4e298a7aa4/volumes" Nov 24 12:44:47 crc kubenswrapper[4752]: I1124 12:44:47.031256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerStarted","Data":"2af224c3a97c1a6731e0adaebb0c448e38b60d082eeb87fda93733f3452fe78f"} Nov 24 12:44:47 crc kubenswrapper[4752]: I1124 12:44:47.036086 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f528da89-b1d3-4b08-a8af-b8371b35ff7c","Type":"ContainerStarted","Data":"30dd8594cfb1763ec1c214c0d7d62c4670e9ac08a63399b8bf9fc8c376b7429f"} Nov 24 12:44:47 crc kubenswrapper[4752]: I1124 12:44:47.036830 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 24 12:44:47 crc kubenswrapper[4752]: I1124 12:44:47.057604 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.494569408 podStartE2EDuration="4.057580605s" podCreationTimestamp="2025-11-24 12:44:43 +0000 UTC" firstStartedPulling="2025-11-24 12:44:44.894972724 +0000 UTC m=+5890.879793003" lastFinishedPulling="2025-11-24 12:44:45.457983911 +0000 UTC m=+5891.442804200" observedRunningTime="2025-11-24 12:44:47.054818676 +0000 UTC m=+5893.039638965" watchObservedRunningTime="2025-11-24 12:44:47.057580605 +0000 UTC m=+5893.042400904" Nov 24 12:44:50 crc kubenswrapper[4752]: I1124 12:44:50.728412 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:44:50 crc kubenswrapper[4752]: E1124 12:44:50.729384 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:44:52 crc kubenswrapper[4752]: I1124 12:44:52.109014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerStarted","Data":"fd3790c080f7e5ff1405a573c95398ccacee9bef6bc8a08ef2170c0c07dff36b"} Nov 24 12:44:52 crc kubenswrapper[4752]: I1124 12:44:52.111628 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"34761b79-c432-4e7e-9715-131bb3bb4450","Type":"ContainerStarted","Data":"98ae072dfd5349f8bd40fb55b81e0c9dd0880b5c61734c43d7dc8cecbd328862"} Nov 24 12:44:53 crc kubenswrapper[4752]: I1124 12:44:53.913662 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 24 12:44:59 crc kubenswrapper[4752]: I1124 12:44:59.181599 4752 generic.go:334] "Generic (PLEG): container finished" podID="50616cf6-5526-40a2-bd9a-6b9608421058" containerID="fd3790c080f7e5ff1405a573c95398ccacee9bef6bc8a08ef2170c0c07dff36b" exitCode=0 Nov 24 12:44:59 crc kubenswrapper[4752]: I1124 12:44:59.181678 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerDied","Data":"fd3790c080f7e5ff1405a573c95398ccacee9bef6bc8a08ef2170c0c07dff36b"} Nov 24 12:44:59 crc kubenswrapper[4752]: E1124 12:44:59.520080 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34761b79_c432_4e7e_9715_131bb3bb4450.slice/crio-98ae072dfd5349f8bd40fb55b81e0c9dd0880b5c61734c43d7dc8cecbd328862.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34761b79_c432_4e7e_9715_131bb3bb4450.slice/crio-conmon-98ae072dfd5349f8bd40fb55b81e0c9dd0880b5c61734c43d7dc8cecbd328862.scope\": RecentStats: unable to find data in memory cache]" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.149732 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8"] Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.151683 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.154115 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.155583 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.162808 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8"] Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.219936 4752 generic.go:334] "Generic (PLEG): container finished" podID="34761b79-c432-4e7e-9715-131bb3bb4450" containerID="98ae072dfd5349f8bd40fb55b81e0c9dd0880b5c61734c43d7dc8cecbd328862" exitCode=0 Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.220058 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"34761b79-c432-4e7e-9715-131bb3bb4450","Type":"ContainerDied","Data":"98ae072dfd5349f8bd40fb55b81e0c9dd0880b5c61734c43d7dc8cecbd328862"} Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.229884 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.230312 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvw78\" (UniqueName: \"kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.230552 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.332445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.332546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.332714 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zvw78\" (UniqueName: \"kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.333502 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.347725 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.356067 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvw78\" (UniqueName: \"kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78\") pod \"collect-profiles-29399805-5vmr8\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.514798 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:00 crc kubenswrapper[4752]: I1124 12:45:00.992964 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8"] Nov 24 12:45:01 crc kubenswrapper[4752]: I1124 12:45:01.232932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" event={"ID":"10f37a67-1524-4a04-b7e6-7546a1b32ee2","Type":"ContainerStarted","Data":"ef9f7dec628d639345ec119d687bcf22ec53c0291705b4433e268c72e4abc1ea"} Nov 24 12:45:01 crc kubenswrapper[4752]: I1124 12:45:01.233302 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" event={"ID":"10f37a67-1524-4a04-b7e6-7546a1b32ee2","Type":"ContainerStarted","Data":"286dc816a564f867693127d160d90095629204298209fa7ff41a9c9b6afa89fb"} Nov 24 12:45:01 crc kubenswrapper[4752]: I1124 12:45:01.251725 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" podStartSLOduration=1.251705388 podStartE2EDuration="1.251705388s" podCreationTimestamp="2025-11-24 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:45:01.24793257 +0000 UTC m=+5907.232752859" watchObservedRunningTime="2025-11-24 12:45:01.251705388 +0000 UTC m=+5907.236525677" Nov 24 12:45:02 crc kubenswrapper[4752]: I1124 12:45:02.255566 4752 generic.go:334] "Generic (PLEG): container finished" podID="10f37a67-1524-4a04-b7e6-7546a1b32ee2" containerID="ef9f7dec628d639345ec119d687bcf22ec53c0291705b4433e268c72e4abc1ea" exitCode=0 Nov 24 
12:45:02 crc kubenswrapper[4752]: I1124 12:45:02.255674 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" event={"ID":"10f37a67-1524-4a04-b7e6-7546a1b32ee2","Type":"ContainerDied","Data":"ef9f7dec628d639345ec119d687bcf22ec53c0291705b4433e268c72e4abc1ea"} Nov 24 12:45:02 crc kubenswrapper[4752]: I1124 12:45:02.732504 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:45:02 crc kubenswrapper[4752]: E1124 12:45:02.732822 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.270836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"34761b79-c432-4e7e-9715-131bb3bb4450","Type":"ContainerStarted","Data":"b96305eda520f142aa76fc6784ae81f2d5273cfa248eaa1374ecb92cee66dfe4"} Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.628893 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.726359 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvw78\" (UniqueName: \"kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78\") pod \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.726466 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume\") pod \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.726555 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume\") pod \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\" (UID: \"10f37a67-1524-4a04-b7e6-7546a1b32ee2\") " Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.727647 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume" (OuterVolumeSpecName: "config-volume") pod "10f37a67-1524-4a04-b7e6-7546a1b32ee2" (UID: "10f37a67-1524-4a04-b7e6-7546a1b32ee2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.750016 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78" (OuterVolumeSpecName: "kube-api-access-zvw78") pod "10f37a67-1524-4a04-b7e6-7546a1b32ee2" (UID: "10f37a67-1524-4a04-b7e6-7546a1b32ee2"). InnerVolumeSpecName "kube-api-access-zvw78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.772029 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10f37a67-1524-4a04-b7e6-7546a1b32ee2" (UID: "10f37a67-1524-4a04-b7e6-7546a1b32ee2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.829317 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10f37a67-1524-4a04-b7e6-7546a1b32ee2-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.829354 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvw78\" (UniqueName: \"kubernetes.io/projected/10f37a67-1524-4a04-b7e6-7546a1b32ee2-kube-api-access-zvw78\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:03 crc kubenswrapper[4752]: I1124 12:45:03.829369 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10f37a67-1524-4a04-b7e6-7546a1b32ee2-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.289258 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" event={"ID":"10f37a67-1524-4a04-b7e6-7546a1b32ee2","Type":"ContainerDied","Data":"286dc816a564f867693127d160d90095629204298209fa7ff41a9c9b6afa89fb"} Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.289594 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286dc816a564f867693127d160d90095629204298209fa7ff41a9c9b6afa89fb" Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.289333 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8" Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.319280 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj"] Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.329979 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399760-k69tj"] Nov 24 12:45:04 crc kubenswrapper[4752]: I1124 12:45:04.745637 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2b95fb-899f-40a8-9282-799f7fa37597" path="/var/lib/kubelet/pods/fb2b95fb-899f-40a8-9282-799f7fa37597/volumes" Nov 24 12:45:07 crc kubenswrapper[4752]: I1124 12:45:07.318617 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"34761b79-c432-4e7e-9715-131bb3bb4450","Type":"ContainerStarted","Data":"f7503d17e482191b4a87dd140acc4eabcfa4e0247f710c57e992d561f83184bd"} Nov 24 12:45:07 crc kubenswrapper[4752]: I1124 12:45:07.319154 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Nov 24 12:45:07 crc kubenswrapper[4752]: I1124 12:45:07.321882 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Nov 24 12:45:07 crc kubenswrapper[4752]: I1124 12:45:07.321927 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerStarted","Data":"645d0fcf66c880a7d9ca0f2418c11a87dfaf25ded9c6e15f715bf378db5f366c"} Nov 24 12:45:07 crc kubenswrapper[4752]: I1124 12:45:07.350314 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.136206703 podStartE2EDuration="23.350291468s" podCreationTimestamp="2025-11-24 12:44:44 +0000 UTC" firstStartedPulling="2025-11-24 12:44:45.414874533 +0000 UTC m=+5891.399694822" lastFinishedPulling="2025-11-24 12:45:02.628959308 +0000 UTC m=+5908.613779587" observedRunningTime="2025-11-24 12:45:07.34061484 +0000 UTC m=+5913.325435139" watchObservedRunningTime="2025-11-24 12:45:07.350291468 +0000 UTC m=+5913.335111757" Nov 24 12:45:11 crc kubenswrapper[4752]: I1124 12:45:11.373183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerStarted","Data":"203590165c5c6b8d11b265b4f886688e3eab9bf4b20089349527a653aa911295"} Nov 24 12:45:14 crc kubenswrapper[4752]: I1124 12:45:14.403858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"50616cf6-5526-40a2-bd9a-6b9608421058","Type":"ContainerStarted","Data":"e618c60e7676930a019614db4697ca43aeb1db3004a748d9165cd030e6dd0b72"} Nov 24 12:45:14 crc kubenswrapper[4752]: I1124 12:45:14.428836 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.622802726 podStartE2EDuration="31.428811517s" podCreationTimestamp="2025-11-24 12:44:43 +0000 UTC" firstStartedPulling="2025-11-24 12:44:46.569069497 +0000 UTC m=+5892.553889786" lastFinishedPulling="2025-11-24 12:45:13.375078288 +0000 UTC m=+5919.359898577" observedRunningTime="2025-11-24 12:45:14.427620393 +0000 UTC m=+5920.412440742" 
watchObservedRunningTime="2025-11-24 12:45:14.428811517 +0000 UTC m=+5920.413631806"
Nov 24 12:45:14 crc kubenswrapper[4752]: I1124 12:45:14.737241 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588"
Nov 24 12:45:14 crc kubenswrapper[4752]: E1124 12:45:14.737606 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 12:45:15 crc kubenswrapper[4752]: I1124 12:45:15.729660 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Nov 24 12:45:15 crc kubenswrapper[4752]: I1124 12:45:15.730037 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Nov 24 12:45:15 crc kubenswrapper[4752]: I1124 12:45:15.732437 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Nov 24 12:45:16 crc kubenswrapper[4752]: I1124 12:45:16.420841 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.535096 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 24 12:45:18 crc kubenswrapper[4752]: E1124 12:45:18.535781 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f37a67-1524-4a04-b7e6-7546a1b32ee2" containerName="collect-profiles"
Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.535794 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f37a67-1524-4a04-b7e6-7546a1b32ee2" containerName="collect-profiles"
Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.536013 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f37a67-1524-4a04-b7e6-7546a1b32ee2" containerName="collect-profiles"
Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.538044 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
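
The machine-config-daemon entries that keep reappearing above are the restart backoff at its cap: the kubelet doubles the delay after each failed restart until it reaches five minutes, which is what the "back-off 5m0s" text reports. A sketch of that schedule, assuming the usual defaults of a 10s initial delay and a 5m cap:

```go
package main

import (
	"fmt"
	"time"
)

// Illustrative model (ours) of kubelet's container-restart backoff, assuming
// a 10s initial delay that doubles per failure and caps at 5m, which is what
// produces the "back-off 5m0s restarting failed container" message above.
func backoff(failures int) time.Duration {
	const (
		initial = 10 * time.Second
		cap     = 5 * time.Minute
	)
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= cap {
			return cap
		}
	}
	return d
}

func main() {
	for f := 1; f <= 7; f++ {
		fmt.Printf("failure %d -> wait %v\n", f, backoff(f)) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
	}
}
```

Once a container has been up long enough, the backoff resets; machine-config-daemon-vhwb4 never gets there in this window, so every sync hits the cap.
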
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.543564 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.543765 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.546065 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.669537 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vv9\" (UniqueName: \"kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.669647 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.669861 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.670022 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.670162 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.670270 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.670315 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.771942 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772053 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772135 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772178 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772202 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772290 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vv9\" (UniqueName: \"kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.772337 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.773388 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.774797 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.786709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.792602 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.815353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.816523 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vv9\" (UniqueName: \"kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.818951 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data\") pod \"ceilometer-0\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " pod="openstack/ceilometer-0" Nov 24 12:45:18 crc kubenswrapper[4752]: I1124 12:45:18.866936 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:19 crc kubenswrapper[4752]: I1124 12:45:19.371450 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:19 crc kubenswrapper[4752]: I1124 12:45:19.474210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerStarted","Data":"e969b95b61d56a593a032bae9d0a73f7724ad970f1bf44707b3daa3a22d90743"} Nov 24 12:45:20 crc kubenswrapper[4752]: I1124 12:45:20.494375 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerStarted","Data":"2e4eda664a9df883804785c027e378ab4f50508047b7746b37e50f6728548d11"} Nov 24 12:45:21 crc kubenswrapper[4752]: I1124 12:45:21.513628 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerStarted","Data":"8fa13470b362497139f02a5c3ce02b10068e552741b6674808639ad271170b0f"} Nov 24 12:45:21 crc kubenswrapper[4752]: I1124 12:45:21.514255 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerStarted","Data":"bbeb110b8a14f47c14df3e6dc334c2253300a868a5f5b9a846a6a7a366eaa8ce"} Nov 24 12:45:23 crc kubenswrapper[4752]: I1124 12:45:23.552786 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerStarted","Data":"05ccbba81ba71e372e38b991b62922b2ba06388e4ab31832c5c1f2dc9f218baf"} Nov 24 12:45:23 crc kubenswrapper[4752]: I1124 12:45:23.553479 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 12:45:23 crc kubenswrapper[4752]: I1124 12:45:23.574368 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.00448059 podStartE2EDuration="5.574354154s" podCreationTimestamp="2025-11-24 12:45:18 +0000 UTC" firstStartedPulling="2025-11-24 12:45:19.390629022 +0000 UTC m=+5925.375449311" lastFinishedPulling="2025-11-24 12:45:22.960502566 +0000 UTC m=+5928.945322875" observedRunningTime="2025-11-24 12:45:23.57315543 +0000 UTC m=+5929.557975729" watchObservedRunningTime="2025-11-24 12:45:23.574354154 +0000 UTC m=+5929.559174443" Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.162981 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-g69bx"]
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.168264 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.173880 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-g69bx"]
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.228106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.228188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjsrn\" (UniqueName: \"kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.330870 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjsrn\" (UniqueName: \"kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.331235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.332521 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.356915 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjsrn\" (UniqueName: \"kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn\") pod \"aodh-db-create-g69bx\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.506372 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-g69bx"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.525386 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-4512-account-create-v6zvq"]
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.527982 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.540128 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.570388 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4512-account-create-v6zvq"]
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.642847 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjm6\" (UniqueName: \"kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.643060 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.745722 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.746205 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjm6\" (UniqueName: \"kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.751968 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.767718 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjm6\" (UniqueName: \"kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6\") pod \"aodh-4512-account-create-v6zvq\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " pod="openstack/aodh-4512-account-create-v6zvq"
Nov 24 12:45:29 crc kubenswrapper[4752]: I1124 12:45:29.975022 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4512-account-create-v6zvq"
Need to start a new one" pod="openstack/aodh-4512-account-create-v6zvq" Nov 24 12:45:30 crc kubenswrapper[4752]: I1124 12:45:30.018163 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-g69bx"] Nov 24 12:45:30 crc kubenswrapper[4752]: W1124 12:45:30.022252 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329f7583_4cd1_46ce_a48d_8f549725dadd.slice/crio-ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499 WatchSource:0}: Error finding container ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499: Status 404 returned error can't find the container with id ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499 Nov 24 12:45:30 crc kubenswrapper[4752]: I1124 12:45:30.064054 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f54k9"] Nov 24 12:45:30 crc kubenswrapper[4752]: I1124 12:45:30.074322 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f54k9"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.465350 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4512-account-create-v6zvq"] Nov 24 12:45:32 crc kubenswrapper[4752]: W1124 12:45:30.483177 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fdd3410_1be6_4f7f_bcb6_6787a8181fdf.slice/crio-deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2 WatchSource:0}: Error finding container deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2: Status 404 returned error can't find the container with id deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2 Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.630993 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4512-account-create-v6zvq" event={"ID":"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf","Type":"ContainerStarted","Data":"deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2"} Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.632953 4752 generic.go:334] "Generic (PLEG): container finished" podID="329f7583-4cd1-46ce-a48d-8f549725dadd" containerID="3e79baf00deb37f543240d8d9693dbf04c89662d72894b560e698a8b4bababa9" exitCode=0 Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.632997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-g69bx" event={"ID":"329f7583-4cd1-46ce-a48d-8f549725dadd","Type":"ContainerDied","Data":"3e79baf00deb37f543240d8d9693dbf04c89662d72894b560e698a8b4bababa9"} Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.633047 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-g69bx" event={"ID":"329f7583-4cd1-46ce-a48d-8f549725dadd","Type":"ContainerStarted","Data":"ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499"} Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.728941 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:45:32 crc kubenswrapper[4752]: E1124 12:45:30.729234 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:30.743069 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ebef79-47d8-4a69-a075-ce63622d42c4" path="/var/lib/kubelet/pods/06ebef79-47d8-4a69-a075-ce63622d42c4/volumes" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.042165 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mlbcr"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.050720 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-43c9-account-create-8qtsg"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.066337 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dbdgt"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.078792 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mlbcr"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.086988 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2ea2-account-create-wzkx2"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.096755 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fb8b-account-create-8dh5x"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.104969 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2ea2-account-create-wzkx2"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.112805 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dbdgt"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.121173 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-43c9-account-create-8qtsg"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.128617 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fb8b-account-create-8dh5x"] Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.646646 4752 generic.go:334] "Generic (PLEG): container finished" podID="8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" containerID="080d0570df3e1f664a50c2fc76edb6cbc96175b04748fb07fa81f7210a945759" exitCode=0 Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:31.646767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4512-account-create-v6zvq" event={"ID":"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf","Type":"ContainerDied","Data":"080d0570df3e1f664a50c2fc76edb6cbc96175b04748fb07fa81f7210a945759"} Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.532094 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-g69bx" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.617033 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjsrn\" (UniqueName: \"kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn\") pod \"329f7583-4cd1-46ce-a48d-8f549725dadd\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.617206 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts\") pod \"329f7583-4cd1-46ce-a48d-8f549725dadd\" (UID: \"329f7583-4cd1-46ce-a48d-8f549725dadd\") " Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.617811 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "329f7583-4cd1-46ce-a48d-8f549725dadd" (UID: "329f7583-4cd1-46ce-a48d-8f549725dadd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.618157 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329f7583-4cd1-46ce-a48d-8f549725dadd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.625254 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn" (OuterVolumeSpecName: "kube-api-access-gjsrn") pod "329f7583-4cd1-46ce-a48d-8f549725dadd" (UID: "329f7583-4cd1-46ce-a48d-8f549725dadd"). InnerVolumeSpecName "kube-api-access-gjsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.659452 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-g69bx" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.659436 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-g69bx" event={"ID":"329f7583-4cd1-46ce-a48d-8f549725dadd","Type":"ContainerDied","Data":"ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499"} Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.659670 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2d2e399b0cc72a689dd2373f1543f0c919daf3e4f2bfb9df3de0428a7f5499" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.720919 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjsrn\" (UniqueName: \"kubernetes.io/projected/329f7583-4cd1-46ce-a48d-8f549725dadd-kube-api-access-gjsrn\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.747707 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be35493-c5d0-476a-ae26-42485af9efe5" path="/var/lib/kubelet/pods/1be35493-c5d0-476a-ae26-42485af9efe5/volumes" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.749328 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a41aac-1ab9-4a64-b65c-1c45ea19c56b" path="/var/lib/kubelet/pods/26a41aac-1ab9-4a64-b65c-1c45ea19c56b/volumes" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.750525 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44914755-229b-4591-8098-84982a49fdbc" path="/var/lib/kubelet/pods/44914755-229b-4591-8098-84982a49fdbc/volumes" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.752461 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55251fd6-dac5-445f-99cb-3a600e94217c" path="/var/lib/kubelet/pods/55251fd6-dac5-445f-99cb-3a600e94217c/volumes" Nov 24 12:45:32 crc kubenswrapper[4752]: I1124 12:45:32.754433 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58d145a-cf82-4d24-acea-034d8e2b5f6a" path="/var/lib/kubelet/pods/c58d145a-cf82-4d24-acea-034d8e2b5f6a/volumes" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.127160 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4512-account-create-v6zvq" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.232935 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts\") pod \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.233160 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjm6\" (UniqueName: \"kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6\") pod \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\" (UID: \"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf\") " Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.234725 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" (UID: "8fdd3410-1be6-4f7f-bcb6-6787a8181fdf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.239326 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6" (OuterVolumeSpecName: "kube-api-access-rbjm6") pod "8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" (UID: "8fdd3410-1be6-4f7f-bcb6-6787a8181fdf"). InnerVolumeSpecName "kube-api-access-rbjm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.336237 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjm6\" (UniqueName: \"kubernetes.io/projected/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-kube-api-access-rbjm6\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.336275 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.679895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4512-account-create-v6zvq" event={"ID":"8fdd3410-1be6-4f7f-bcb6-6787a8181fdf","Type":"ContainerDied","Data":"deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2"} Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.680183 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deec19a7dcff2150a9ce5918f4039511de67fa0fce6b40d3db1f56aad2ece8b2" Nov 24 12:45:33 crc kubenswrapper[4752]: I1124 12:45:33.680089 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4512-account-create-v6zvq" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.906811 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-74jvb"] Nov 24 12:45:34 crc kubenswrapper[4752]: E1124 12:45:34.907768 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" containerName="mariadb-account-create" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.907784 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" containerName="mariadb-account-create" Nov 24 12:45:34 crc kubenswrapper[4752]: E1124 12:45:34.907808 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329f7583-4cd1-46ce-a48d-8f549725dadd" containerName="mariadb-database-create" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.907815 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="329f7583-4cd1-46ce-a48d-8f549725dadd" containerName="mariadb-database-create" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.908034 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" containerName="mariadb-account-create" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.908059 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="329f7583-4cd1-46ce-a48d-8f549725dadd" containerName="mariadb-database-create" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.908851 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.909763 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-74jvb"] Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.913580 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.913638 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sj676" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.913845 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.913907 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.974608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.974787 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.974838 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znqqv\" (UniqueName: \"kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:34 crc kubenswrapper[4752]: I1124 12:45:34.974924 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.076855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.076915 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znqqv\" (UniqueName: \"kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.076961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: 
I1124 12:45:35.077038 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.082643 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.086361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.087225 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.100039 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znqqv\" (UniqueName: \"kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv\") pod \"aodh-db-sync-74jvb\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:35 crc kubenswrapper[4752]: I1124 12:45:35.231045 4752 util.go:30] "No sandbox for pod can be found. 
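
Every pod in this log gets a generated kube-api-access-* projected volume: the service-account token, the cluster CA bundle, and the namespace, mounted read-only at a fixed in-container path. From inside a container those pieces are plain files; a small reader (meant to run in-pod, where outside a pod the files simply will not exist):

```go
package main

import (
	"fmt"
	"os"
)

// saDir is the standard mount point for the kube-api-access-* projected
// volume inside a container.
const saDir = "/var/run/secrets/kubernetes.io/serviceaccount"

func main() {
	for _, f := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(saDir + "/" + f)
		if err != nil {
			fmt.Println(f, "->", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", f, len(b))
	}
}
```

This is why even the one-shot db-create jobs above mount a kube-api-access volume: any client-go code in the container picks up these files automatically via rest.InClusterConfig.
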
Need to start a new one" pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:36 crc kubenswrapper[4752]: I1124 12:45:36.037357 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-74jvb"] Nov 24 12:45:36 crc kubenswrapper[4752]: I1124 12:45:36.717350 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-74jvb" event={"ID":"73030570-ae25-48f4-bdee-e70a12c2623e","Type":"ContainerStarted","Data":"781ce9acb2f8dc4122eb25ace5455a43e22effd1a72ef0ec7c5fb99c3790896b"} Nov 24 12:45:40 crc kubenswrapper[4752]: I1124 12:45:40.036552 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrng"] Nov 24 12:45:40 crc kubenswrapper[4752]: I1124 12:45:40.093066 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bdrng"] Nov 24 12:45:40 crc kubenswrapper[4752]: I1124 12:45:40.742735 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e91b6fb-094d-468a-af30-0e161a3e16ab" path="/var/lib/kubelet/pods/6e91b6fb-094d-468a-af30-0e161a3e16ab/volumes" Nov 24 12:45:41 crc kubenswrapper[4752]: I1124 12:45:41.729085 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:45:41 crc kubenswrapper[4752]: E1124 12:45:41.729912 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:45:42 crc kubenswrapper[4752]: I1124 12:45:42.791519 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-74jvb" event={"ID":"73030570-ae25-48f4-bdee-e70a12c2623e","Type":"ContainerStarted","Data":"98dc4929120b24fbe9ed100aa2120d926debdafe325a7b8d856a7b07d4d6337c"} Nov 24 12:45:42 crc kubenswrapper[4752]: I1124 12:45:42.817995 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-74jvb" podStartSLOduration=3.016972988 podStartE2EDuration="8.817964481s" podCreationTimestamp="2025-11-24 12:45:34 +0000 UTC" firstStartedPulling="2025-11-24 12:45:36.045626684 +0000 UTC m=+5942.030446973" lastFinishedPulling="2025-11-24 12:45:41.846618177 +0000 UTC m=+5947.831438466" observedRunningTime="2025-11-24 12:45:42.814605255 +0000 UTC m=+5948.799425564" watchObservedRunningTime="2025-11-24 12:45:42.817964481 +0000 UTC m=+5948.802784800" Nov 24 12:45:44 crc kubenswrapper[4752]: I1124 12:45:44.819237 4752 generic.go:334] "Generic (PLEG): container finished" podID="73030570-ae25-48f4-bdee-e70a12c2623e" containerID="98dc4929120b24fbe9ed100aa2120d926debdafe325a7b8d856a7b07d4d6337c" exitCode=0 Nov 24 12:45:44 crc kubenswrapper[4752]: I1124 12:45:44.819289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-74jvb" event={"ID":"73030570-ae25-48f4-bdee-e70a12c2623e","Type":"ContainerDied","Data":"98dc4929120b24fbe9ed100aa2120d926debdafe325a7b8d856a7b07d4d6337c"} Nov 24 12:45:45 crc kubenswrapper[4752]: I1124 12:45:45.670649 4752 scope.go:117] "RemoveContainer" containerID="a6c4db08d5ed908e7b36eb39e1962f11529ba79d0e4961289bb5fc5026200c06" Nov 24 12:45:45 crc kubenswrapper[4752]: I1124 12:45:45.709818 4752 scope.go:117] "RemoveContainer" 
containerID="7ee1badcca9a9830941349e81895cdae778096336dfc507f10562388a478a9a8" Nov 24 12:45:45 crc kubenswrapper[4752]: I1124 12:45:45.788123 4752 scope.go:117] "RemoveContainer" containerID="70242a0e997c03dadde82918af5df61024dd5115f3af6886478123f5e51f8ccd" Nov 24 12:45:45 crc kubenswrapper[4752]: I1124 12:45:45.888009 4752 scope.go:117] "RemoveContainer" containerID="6e75e69472046419ee54bd5401569b2d91a312a57f044380163361e9ba16a0b6" Nov 24 12:45:45 crc kubenswrapper[4752]: I1124 12:45:45.959148 4752 scope.go:117] "RemoveContainer" containerID="7721169ae85727d9a39cd13e0a925e97b570f57d194841e1265f55da8b66aefb" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.008902 4752 scope.go:117] "RemoveContainer" containerID="b5af52724f270e0e2ea22302d1dfb909bea70b499eb59dd08546f1bdbc7951b4" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.036367 4752 scope.go:117] "RemoveContainer" containerID="f6f1107bb8c6bdab6510656d8c2f84a7b272365cce808da81f9a844af1cc150b" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.090482 4752 scope.go:117] "RemoveContainer" containerID="5c3ca94672a7e945e47f1589c91eda4ea922c0027b0eece350599aef423112f3" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.189928 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.359610 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data\") pod \"73030570-ae25-48f4-bdee-e70a12c2623e\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.359739 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts\") pod \"73030570-ae25-48f4-bdee-e70a12c2623e\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.360043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znqqv\" (UniqueName: \"kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv\") pod \"73030570-ae25-48f4-bdee-e70a12c2623e\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.360197 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle\") pod \"73030570-ae25-48f4-bdee-e70a12c2623e\" (UID: \"73030570-ae25-48f4-bdee-e70a12c2623e\") " Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.365343 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts" (OuterVolumeSpecName: "scripts") pod "73030570-ae25-48f4-bdee-e70a12c2623e" (UID: "73030570-ae25-48f4-bdee-e70a12c2623e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.365466 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv" (OuterVolumeSpecName: "kube-api-access-znqqv") pod "73030570-ae25-48f4-bdee-e70a12c2623e" (UID: "73030570-ae25-48f4-bdee-e70a12c2623e"). 
InnerVolumeSpecName "kube-api-access-znqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.387836 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data" (OuterVolumeSpecName: "config-data") pod "73030570-ae25-48f4-bdee-e70a12c2623e" (UID: "73030570-ae25-48f4-bdee-e70a12c2623e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.393472 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73030570-ae25-48f4-bdee-e70a12c2623e" (UID: "73030570-ae25-48f4-bdee-e70a12c2623e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.463265 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.463296 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.463305 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73030570-ae25-48f4-bdee-e70a12c2623e-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.463315 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znqqv\" (UniqueName: \"kubernetes.io/projected/73030570-ae25-48f4-bdee-e70a12c2623e-kube-api-access-znqqv\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.874498 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-74jvb" event={"ID":"73030570-ae25-48f4-bdee-e70a12c2623e","Type":"ContainerDied","Data":"781ce9acb2f8dc4122eb25ace5455a43e22effd1a72ef0ec7c5fb99c3790896b"} Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.874929 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="781ce9acb2f8dc4122eb25ace5455a43e22effd1a72ef0ec7c5fb99c3790896b" Nov 24 12:45:46 crc kubenswrapper[4752]: I1124 12:45:46.874599 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-74jvb" Nov 24 12:45:48 crc kubenswrapper[4752]: I1124 12:45:48.878041 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.009181 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 24 12:45:50 crc kubenswrapper[4752]: E1124 12:45:50.009886 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73030570-ae25-48f4-bdee-e70a12c2623e" containerName="aodh-db-sync" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.009899 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="73030570-ae25-48f4-bdee-e70a12c2623e" containerName="aodh-db-sync" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.013092 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="73030570-ae25-48f4-bdee-e70a12c2623e" containerName="aodh-db-sync" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.015511 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.018443 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.018633 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sj676" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.024271 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.033075 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.164202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-scripts\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.164321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.164357 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ghz\" (UniqueName: \"kubernetes.io/projected/b8cfa846-da98-401c-968b-aa10bf8093de-kube-api-access-95ghz\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.164440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-config-data\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.266273 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-scripts\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 
crc kubenswrapper[4752]: I1124 12:45:50.266396 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.266440 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ghz\" (UniqueName: \"kubernetes.io/projected/b8cfa846-da98-401c-968b-aa10bf8093de-kube-api-access-95ghz\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.266521 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-config-data\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.276572 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.278773 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-config-data\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.288819 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8cfa846-da98-401c-968b-aa10bf8093de-scripts\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.301234 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ghz\" (UniqueName: \"kubernetes.io/projected/b8cfa846-da98-401c-968b-aa10bf8093de-kube-api-access-95ghz\") pod \"aodh-0\" (UID: \"b8cfa846-da98-401c-968b-aa10bf8093de\") " pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.342385 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.859859 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 24 12:45:50 crc kubenswrapper[4752]: I1124 12:45:50.947997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b8cfa846-da98-401c-968b-aa10bf8093de","Type":"ContainerStarted","Data":"dd7ca60f337b485cf24d62320062207ea1257035137ac67f46fd690669a6f81f"} Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.940003 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.941496 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-central-agent" containerID="cri-o://2e4eda664a9df883804785c027e378ab4f50508047b7746b37e50f6728548d11" gracePeriod=30 Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.941562 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="proxy-httpd" containerID="cri-o://05ccbba81ba71e372e38b991b62922b2ba06388e4ab31832c5c1f2dc9f218baf" gracePeriod=30 Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.941680 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-notification-agent" containerID="cri-o://bbeb110b8a14f47c14df3e6dc334c2253300a868a5f5b9a846a6a7a366eaa8ce" gracePeriod=30 Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.941562 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="sg-core" containerID="cri-o://8fa13470b362497139f02a5c3ce02b10068e552741b6674808639ad271170b0f" gracePeriod=30 Nov 24 12:45:51 crc kubenswrapper[4752]: I1124 12:45:51.969389 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b8cfa846-da98-401c-968b-aa10bf8093de","Type":"ContainerStarted","Data":"d8f9717ebcb39e52d7173da522ab2e33a450d2b6ea8bbeb4bd1a780f4eeb319f"} Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.983345 4752 generic.go:334] "Generic (PLEG): container finished" podID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerID="05ccbba81ba71e372e38b991b62922b2ba06388e4ab31832c5c1f2dc9f218baf" exitCode=0 Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.983939 4752 generic.go:334] "Generic (PLEG): container finished" podID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerID="8fa13470b362497139f02a5c3ce02b10068e552741b6674808639ad271170b0f" exitCode=2 Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.983963 4752 generic.go:334] "Generic (PLEG): container finished" podID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerID="2e4eda664a9df883804785c027e378ab4f50508047b7746b37e50f6728548d11" exitCode=0 Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.983430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerDied","Data":"05ccbba81ba71e372e38b991b62922b2ba06388e4ab31832c5c1f2dc9f218baf"} Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.984026 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerDied","Data":"8fa13470b362497139f02a5c3ce02b10068e552741b6674808639ad271170b0f"} Nov 24 12:45:52 crc kubenswrapper[4752]: I1124 12:45:52.984051 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerDied","Data":"2e4eda664a9df883804785c027e378ab4f50508047b7746b37e50f6728548d11"} Nov 24 12:45:53 crc kubenswrapper[4752]: I1124 12:45:53.994731 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b8cfa846-da98-401c-968b-aa10bf8093de","Type":"ContainerStarted","Data":"f6312ebd2f586b04c24d6a405a309b40640c41e95cf7873235037b48c5231068"} Nov 24 12:45:54 crc kubenswrapper[4752]: I1124 12:45:54.040170 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vdv2t"] Nov 24 12:45:54 crc kubenswrapper[4752]: I1124 12:45:54.050527 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vdv2t"] Nov 24 12:45:54 crc kubenswrapper[4752]: I1124 12:45:54.741608 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c32f7d-4d73-4876-8610-95810a5318c6" path="/var/lib/kubelet/pods/73c32f7d-4d73-4876-8610-95810a5318c6/volumes" Nov 24 12:45:55 crc kubenswrapper[4752]: I1124 12:45:55.029043 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4jn2n"] Nov 24 12:45:55 crc kubenswrapper[4752]: I1124 12:45:55.037700 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4jn2n"] Nov 24 12:45:56 crc kubenswrapper[4752]: I1124 12:45:56.020109 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b8cfa846-da98-401c-968b-aa10bf8093de","Type":"ContainerStarted","Data":"82c757002aeefc473d578482e2199b0a04ff87cefb0ef81d5056e4b1e0e5d9f2"} Nov 24 12:45:56 crc kubenswrapper[4752]: I1124 12:45:56.728731 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:45:56 crc kubenswrapper[4752]: I1124 12:45:56.741485 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f4c990-46bb-4b1a-ad4b-c7207ab5facd" path="/var/lib/kubelet/pods/f0f4c990-46bb-4b1a-ad4b-c7207ab5facd/volumes" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.030490 4752 generic.go:334] "Generic (PLEG): container finished" podID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerID="bbeb110b8a14f47c14df3e6dc334c2253300a868a5f5b9a846a6a7a366eaa8ce" exitCode=0 Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.030956 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerDied","Data":"bbeb110b8a14f47c14df3e6dc334c2253300a868a5f5b9a846a6a7a366eaa8ce"} Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.033948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b"} Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.280830 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449438 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449558 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449614 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449790 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87vv9\" (UniqueName: \"kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.449882 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd\") pod \"4e30d780-072c-4d4a-86c7-4e26f594c466\" (UID: \"4e30d780-072c-4d4a-86c7-4e26f594c466\") " Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.450829 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.466448 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9" (OuterVolumeSpecName: "kube-api-access-87vv9") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "kube-api-access-87vv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.466936 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts" (OuterVolumeSpecName: "scripts") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.475099 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.500918 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.534350 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552687 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87vv9\" (UniqueName: \"kubernetes.io/projected/4e30d780-072c-4d4a-86c7-4e26f594c466-kube-api-access-87vv9\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552728 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552755 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552767 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552779 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e30d780-072c-4d4a-86c7-4e26f594c466-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.552791 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.600805 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data" (OuterVolumeSpecName: "config-data") pod "4e30d780-072c-4d4a-86c7-4e26f594c466" (UID: "4e30d780-072c-4d4a-86c7-4e26f594c466"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:45:57 crc kubenswrapper[4752]: I1124 12:45:57.654656 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e30d780-072c-4d4a-86c7-4e26f594c466-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.050360 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e30d780-072c-4d4a-86c7-4e26f594c466","Type":"ContainerDied","Data":"e969b95b61d56a593a032bae9d0a73f7724ad970f1bf44707b3daa3a22d90743"} Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.050791 4752 scope.go:117] "RemoveContainer" containerID="05ccbba81ba71e372e38b991b62922b2ba06388e4ab31832c5c1f2dc9f218baf" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.050970 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.094608 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.108357 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.123251 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:58 crc kubenswrapper[4752]: E1124 12:45:58.123849 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="proxy-httpd" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.123870 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="proxy-httpd" Nov 24 12:45:58 crc kubenswrapper[4752]: E1124 12:45:58.123889 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-central-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.123896 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-central-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: E1124 12:45:58.123908 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-notification-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.123916 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-notification-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: E1124 12:45:58.123988 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="sg-core" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.123995 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="sg-core" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.124216 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="sg-core" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.124260 4752 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-notification-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.124284 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="proxy-httpd" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.124302 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" containerName="ceilometer-central-agent" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.126398 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.129618 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.129824 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.133644 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.271635 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.271945 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.272171 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.272227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.272338 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.272588 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.272657 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.371357 4752 scope.go:117] "RemoveContainer" containerID="8fa13470b362497139f02a5c3ce02b10068e552741b6674808639ad271170b0f" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376039 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376196 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376255 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.377845 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.378267 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.376592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.379445 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " 
pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.381685 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.382689 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.383309 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.394435 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.412567 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr\") pod \"ceilometer-0\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.431008 4752 scope.go:117] "RemoveContainer" containerID="bbeb110b8a14f47c14df3e6dc334c2253300a868a5f5b9a846a6a7a366eaa8ce" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.473453 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.634374 4752 scope.go:117] "RemoveContainer" containerID="2e4eda664a9df883804785c027e378ab4f50508047b7746b37e50f6728548d11" Nov 24 12:45:58 crc kubenswrapper[4752]: I1124 12:45:58.747735 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e30d780-072c-4d4a-86c7-4e26f594c466" path="/var/lib/kubelet/pods/4e30d780-072c-4d4a-86c7-4e26f594c466/volumes" Nov 24 12:45:59 crc kubenswrapper[4752]: I1124 12:45:59.002831 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:45:59 crc kubenswrapper[4752]: I1124 12:45:59.066259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b8cfa846-da98-401c-968b-aa10bf8093de","Type":"ContainerStarted","Data":"b4af2c3627072c0436dfc216ae0b9829e42011962bb5555c529af5338a31fb22"} Nov 24 12:45:59 crc kubenswrapper[4752]: I1124 12:45:59.070834 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerStarted","Data":"bf6963b7516244cf803227db21876fb39e56f8359e0e4bbeb6edb8aadeb10ab0"} Nov 24 12:45:59 crc kubenswrapper[4752]: I1124 12:45:59.091298 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.527872788 podStartE2EDuration="10.091273691s" podCreationTimestamp="2025-11-24 12:45:49 +0000 UTC" firstStartedPulling="2025-11-24 12:45:50.868078681 +0000 UTC m=+5956.852898960" lastFinishedPulling="2025-11-24 12:45:58.431479574 +0000 UTC m=+5964.416299863" observedRunningTime="2025-11-24 12:45:59.087279637 +0000 UTC m=+5965.072099926" watchObservedRunningTime="2025-11-24 12:45:59.091273691 +0000 UTC m=+5965.076093980" Nov 24 12:46:00 crc kubenswrapper[4752]: I1124 12:46:00.087755 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerStarted","Data":"45f427be77176c8c7480233ef78da5e83bacabd33e0f3969b88b0cb13e2f8db6"} Nov 24 12:46:01 crc kubenswrapper[4752]: I1124 12:46:01.102927 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerStarted","Data":"7246ff097d33a175194c8681d68818325857b181214ca1f263740e63712b3e1d"} Nov 24 12:46:02 crc kubenswrapper[4752]: I1124 12:46:02.114677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerStarted","Data":"c406a8a8a2f912e28a2f965f0a42f996eb33ae004074b356e037bde4cc8f8994"} Nov 24 12:46:03 crc kubenswrapper[4752]: I1124 12:46:03.154676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerStarted","Data":"86b178d77c4cc3790b2e8301dd07be220d333d5adb3cffc97e0acc39df6edbf5"} Nov 24 12:46:03 crc kubenswrapper[4752]: I1124 12:46:03.155415 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 12:46:03 crc kubenswrapper[4752]: I1124 12:46:03.185296 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.803943146 podStartE2EDuration="5.185274796s" podCreationTimestamp="2025-11-24 12:45:58 +0000 UTC" firstStartedPulling="2025-11-24 12:45:59.009836963 +0000 UTC m=+5964.994657252" 
lastFinishedPulling="2025-11-24 12:46:02.391168613 +0000 UTC m=+5968.375988902" observedRunningTime="2025-11-24 12:46:03.178232024 +0000 UTC m=+5969.163052313" watchObservedRunningTime="2025-11-24 12:46:03.185274796 +0000 UTC m=+5969.170095085" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.880399 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-d2fwj"] Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.882312 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.889403 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-d2fwj"] Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.923647 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prr9s\" (UniqueName: \"kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.923940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.981688 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-e5e7-account-create-nrcvg"] Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.983572 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.985268 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 24 12:46:04 crc kubenswrapper[4752]: I1124 12:46:04.990053 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-e5e7-account-create-nrcvg"] Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.025800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.025919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxh4\" (UniqueName: \"kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.026018 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.026109 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prr9s\" (UniqueName: \"kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.026521 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.043759 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prr9s\" (UniqueName: \"kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s\") pod \"manila-db-create-d2fwj\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.128176 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxh4\" (UniqueName: \"kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.128266 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " 
pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.129179 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.158527 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxh4\" (UniqueName: \"kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4\") pod \"manila-e5e7-account-create-nrcvg\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.201937 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.297838 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.735470 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-d2fwj"] Nov 24 12:46:05 crc kubenswrapper[4752]: W1124 12:46:05.739406 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75b1add3_2b06_4170_ac77_e588f45ac2c9.slice/crio-1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e WatchSource:0}: Error finding container 1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e: Status 404 returned error can't find the container with id 1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e Nov 24 12:46:05 crc kubenswrapper[4752]: I1124 12:46:05.908454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-e5e7-account-create-nrcvg"] Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.184037 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-d2fwj" event={"ID":"75b1add3-2b06-4170-ac77-e588f45ac2c9","Type":"ContainerStarted","Data":"4262aeebce2551c3175a6e45917416d62fd1b9aae5132152b2b7a354636e3d3c"} Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.184093 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-d2fwj" event={"ID":"75b1add3-2b06-4170-ac77-e588f45ac2c9","Type":"ContainerStarted","Data":"1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e"} Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.186360 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-e5e7-account-create-nrcvg" event={"ID":"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7","Type":"ContainerStarted","Data":"8d9b9b3f9cc2041ed11f1330ad7b77af1b884ffe1cd4385713205ce7e971e29f"} Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.186421 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-e5e7-account-create-nrcvg" event={"ID":"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7","Type":"ContainerStarted","Data":"794634110d790c34c0e9a42eefd4f0def1c0297a2f50a901dfac5adc386a5666"} Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.205266 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-d2fwj" podStartSLOduration=2.205243299 
podStartE2EDuration="2.205243299s" podCreationTimestamp="2025-11-24 12:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:46:06.199507224 +0000 UTC m=+5972.184327513" watchObservedRunningTime="2025-11-24 12:46:06.205243299 +0000 UTC m=+5972.190063588" Nov 24 12:46:06 crc kubenswrapper[4752]: I1124 12:46:06.225191 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-e5e7-account-create-nrcvg" podStartSLOduration=2.225173771 podStartE2EDuration="2.225173771s" podCreationTimestamp="2025-11-24 12:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:46:06.21711977 +0000 UTC m=+5972.201940069" watchObservedRunningTime="2025-11-24 12:46:06.225173771 +0000 UTC m=+5972.209994060" Nov 24 12:46:07 crc kubenswrapper[4752]: I1124 12:46:07.203696 4752 generic.go:334] "Generic (PLEG): container finished" podID="75b1add3-2b06-4170-ac77-e588f45ac2c9" containerID="4262aeebce2551c3175a6e45917416d62fd1b9aae5132152b2b7a354636e3d3c" exitCode=0 Nov 24 12:46:07 crc kubenswrapper[4752]: I1124 12:46:07.203794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-d2fwj" event={"ID":"75b1add3-2b06-4170-ac77-e588f45ac2c9","Type":"ContainerDied","Data":"4262aeebce2551c3175a6e45917416d62fd1b9aae5132152b2b7a354636e3d3c"} Nov 24 12:46:07 crc kubenswrapper[4752]: I1124 12:46:07.221462 4752 generic.go:334] "Generic (PLEG): container finished" podID="7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" containerID="8d9b9b3f9cc2041ed11f1330ad7b77af1b884ffe1cd4385713205ce7e971e29f" exitCode=0 Nov 24 12:46:07 crc kubenswrapper[4752]: I1124 12:46:07.221532 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-e5e7-account-create-nrcvg" event={"ID":"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7","Type":"ContainerDied","Data":"8d9b9b3f9cc2041ed11f1330ad7b77af1b884ffe1cd4385713205ce7e971e29f"} Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.805318 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.810534 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.930216 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts\") pod \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.930426 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts\") pod \"75b1add3-2b06-4170-ac77-e588f45ac2c9\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.930545 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmxh4\" (UniqueName: \"kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4\") pod \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\" (UID: \"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7\") " Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.930606 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prr9s\" (UniqueName: \"kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s\") pod \"75b1add3-2b06-4170-ac77-e588f45ac2c9\" (UID: \"75b1add3-2b06-4170-ac77-e588f45ac2c9\") " Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.931083 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" (UID: "7779481c-2a7e-408f-a8d0-3ffbf8abe7a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.931556 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75b1add3-2b06-4170-ac77-e588f45ac2c9" (UID: "75b1add3-2b06-4170-ac77-e588f45ac2c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.932015 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.932062 4752 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75b1add3-2b06-4170-ac77-e588f45ac2c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.937701 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4" (OuterVolumeSpecName: "kube-api-access-rmxh4") pod "7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" (UID: "7779481c-2a7e-408f-a8d0-3ffbf8abe7a7"). InnerVolumeSpecName "kube-api-access-rmxh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:08 crc kubenswrapper[4752]: I1124 12:46:08.938400 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s" (OuterVolumeSpecName: "kube-api-access-prr9s") pod "75b1add3-2b06-4170-ac77-e588f45ac2c9" (UID: "75b1add3-2b06-4170-ac77-e588f45ac2c9"). InnerVolumeSpecName "kube-api-access-prr9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.034383 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmxh4\" (UniqueName: \"kubernetes.io/projected/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7-kube-api-access-rmxh4\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.034445 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prr9s\" (UniqueName: \"kubernetes.io/projected/75b1add3-2b06-4170-ac77-e588f45ac2c9-kube-api-access-prr9s\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.248387 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-e5e7-account-create-nrcvg" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.248539 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-e5e7-account-create-nrcvg" event={"ID":"7779481c-2a7e-408f-a8d0-3ffbf8abe7a7","Type":"ContainerDied","Data":"794634110d790c34c0e9a42eefd4f0def1c0297a2f50a901dfac5adc386a5666"} Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.248576 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="794634110d790c34c0e9a42eefd4f0def1c0297a2f50a901dfac5adc386a5666" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.250897 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-d2fwj" event={"ID":"75b1add3-2b06-4170-ac77-e588f45ac2c9","Type":"ContainerDied","Data":"1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e"} Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.250944 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-d2fwj" Nov 24 12:46:09 crc kubenswrapper[4752]: I1124 12:46:09.250949 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b42c6b35fff9d5f46687964d1684ac8b52e762cd84e6210822b3c9548e4345e" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.312398 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-x57sk"] Nov 24 12:46:10 crc kubenswrapper[4752]: E1124 12:46:10.313119 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" containerName="mariadb-account-create" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.313133 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" containerName="mariadb-account-create" Nov 24 12:46:10 crc kubenswrapper[4752]: E1124 12:46:10.313185 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b1add3-2b06-4170-ac77-e588f45ac2c9" containerName="mariadb-database-create" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.313196 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b1add3-2b06-4170-ac77-e588f45ac2c9" containerName="mariadb-database-create" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.313458 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b1add3-2b06-4170-ac77-e588f45ac2c9" containerName="mariadb-database-create" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.313486 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" containerName="mariadb-account-create" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.314349 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.318142 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-828mv" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.318380 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.336609 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-x57sk"] Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.468476 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.468767 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6wq4\" (UniqueName: \"kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.468972 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.469072 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.571190 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.571271 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.571309 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.571390 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6wq4\" (UniqueName: \"kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4\") pod \"manila-db-sync-x57sk\" (UID: 
\"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.577278 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.577332 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.577574 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.596418 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6wq4\" (UniqueName: \"kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4\") pod \"manila-db-sync-x57sk\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:10 crc kubenswrapper[4752]: I1124 12:46:10.650818 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:11 crc kubenswrapper[4752]: W1124 12:46:11.402842 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16f6b200_e723_474f_8bb1_c4e502ebd5ad.slice/crio-f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413 WatchSource:0}: Error finding container f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413: Status 404 returned error can't find the container with id f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413 Nov 24 12:46:11 crc kubenswrapper[4752]: I1124 12:46:11.407623 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-x57sk"] Nov 24 12:46:12 crc kubenswrapper[4752]: I1124 12:46:12.282146 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x57sk" event={"ID":"16f6b200-e723-474f-8bb1-c4e502ebd5ad","Type":"ContainerStarted","Data":"f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413"} Nov 24 12:46:14 crc kubenswrapper[4752]: I1124 12:46:14.029634 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zmsrk"] Nov 24 12:46:14 crc kubenswrapper[4752]: I1124 12:46:14.039834 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zmsrk"] Nov 24 12:46:14 crc kubenswrapper[4752]: I1124 12:46:14.743594 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af01e52-ecc0-4328-9619-a48520ec233e" path="/var/lib/kubelet/pods/7af01e52-ecc0-4328-9619-a48520ec233e/volumes" Nov 24 12:46:18 crc kubenswrapper[4752]: I1124 12:46:18.360698 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x57sk" 
event={"ID":"16f6b200-e723-474f-8bb1-c4e502ebd5ad","Type":"ContainerStarted","Data":"7075480ff63c5ad7c9395589d146b87662edef8e1426ee784477f81506c4f158"} Nov 24 12:46:18 crc kubenswrapper[4752]: I1124 12:46:18.394902 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-x57sk" podStartSLOduration=2.285574955 podStartE2EDuration="8.394883592s" podCreationTimestamp="2025-11-24 12:46:10 +0000 UTC" firstStartedPulling="2025-11-24 12:46:11.405544113 +0000 UTC m=+5977.390364402" lastFinishedPulling="2025-11-24 12:46:17.51485275 +0000 UTC m=+5983.499673039" observedRunningTime="2025-11-24 12:46:18.384857504 +0000 UTC m=+5984.369677803" watchObservedRunningTime="2025-11-24 12:46:18.394883592 +0000 UTC m=+5984.379703871" Nov 24 12:46:22 crc kubenswrapper[4752]: I1124 12:46:22.410891 4752 generic.go:334] "Generic (PLEG): container finished" podID="16f6b200-e723-474f-8bb1-c4e502ebd5ad" containerID="7075480ff63c5ad7c9395589d146b87662edef8e1426ee784477f81506c4f158" exitCode=0 Nov 24 12:46:22 crc kubenswrapper[4752]: I1124 12:46:22.410993 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x57sk" event={"ID":"16f6b200-e723-474f-8bb1-c4e502ebd5ad","Type":"ContainerDied","Data":"7075480ff63c5ad7c9395589d146b87662edef8e1426ee784477f81506c4f158"} Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.909815 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.990917 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data\") pod \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.990968 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle\") pod \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.990997 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data\") pod \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.991041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6wq4\" (UniqueName: \"kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4\") pod \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\" (UID: \"16f6b200-e723-474f-8bb1-c4e502ebd5ad\") " Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.998728 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4" (OuterVolumeSpecName: "kube-api-access-l6wq4") pod "16f6b200-e723-474f-8bb1-c4e502ebd5ad" (UID: "16f6b200-e723-474f-8bb1-c4e502ebd5ad"). InnerVolumeSpecName "kube-api-access-l6wq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:23 crc kubenswrapper[4752]: I1124 12:46:23.998915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "16f6b200-e723-474f-8bb1-c4e502ebd5ad" (UID: "16f6b200-e723-474f-8bb1-c4e502ebd5ad"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.000763 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data" (OuterVolumeSpecName: "config-data") pod "16f6b200-e723-474f-8bb1-c4e502ebd5ad" (UID: "16f6b200-e723-474f-8bb1-c4e502ebd5ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.022469 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16f6b200-e723-474f-8bb1-c4e502ebd5ad" (UID: "16f6b200-e723-474f-8bb1-c4e502ebd5ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.108728 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.108817 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.108838 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6wq4\" (UniqueName: \"kubernetes.io/projected/16f6b200-e723-474f-8bb1-c4e502ebd5ad-kube-api-access-l6wq4\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.108857 4752 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/16f6b200-e723-474f-8bb1-c4e502ebd5ad-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.435068 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-x57sk" event={"ID":"16f6b200-e723-474f-8bb1-c4e502ebd5ad","Type":"ContainerDied","Data":"f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413"} Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.435120 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f22e4b4a018a61e2cfd292d41e4be36a46674aaca64bf4ffa2bf4278dfaaf413" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.435135 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-x57sk" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.815396 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 24 12:46:24 crc kubenswrapper[4752]: E1124 12:46:24.824002 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f6b200-e723-474f-8bb1-c4e502ebd5ad" containerName="manila-db-sync" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.824771 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f6b200-e723-474f-8bb1-c4e502ebd5ad" containerName="manila-db-sync" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.825135 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f6b200-e723-474f-8bb1-c4e502ebd5ad" containerName="manila-db-sync" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.826804 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.830527 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.830806 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.833446 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.834495 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-828mv" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.847908 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.864418 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.866870 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.869661 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.903104 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923457 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-scripts\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923541 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923619 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923648 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923668 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.923758 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtth\" (UniqueName: \"kubernetes.io/projected/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-kube-api-access-zhtth\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.931913 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"] Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.934136 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:24 crc kubenswrapper[4752]: I1124 12:46:24.957357 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"] Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025177 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-ceph\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025225 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025244 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025293 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025345 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025367 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025401 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025448 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025464 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025481 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025500 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025536 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025564 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-scripts\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025628 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtth\" (UniqueName: \"kubernetes.io/projected/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-kube-api-access-zhtth\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025804 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025829 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbpc\" (UniqueName: \"kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025857 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-scripts\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.025900 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdvc\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-kube-api-access-9zdvc\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.026365 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.029500 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.032050 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.041984 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-scripts\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.042858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-config-data\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.051410 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtth\" (UniqueName: \"kubernetes.io/projected/d9e8e957-b8c5-40bc-bbb6-aa5800cac83f-kube-api-access-zhtth\") pod \"manila-scheduler-0\" (UID: \"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f\") " pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.091349 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.096198 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.098218 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.105110 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128186 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128232 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128256 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128294 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-scripts\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128359 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbpc\" (UniqueName: \"kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128378 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdvc\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-kube-api-access-9zdvc\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-ceph\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128626 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128814 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.128844 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.129176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.129592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5bf54a05-b002-4171-8c39-0707e219e168-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.130199 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.130447 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " 
pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.131067 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.131673 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-ceph\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.136286 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-scripts\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.136500 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.136769 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-config-data\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.137263 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf54a05-b002-4171-8c39-0707e219e168-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.161429 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.162458 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdvc\" (UniqueName: \"kubernetes.io/projected/5bf54a05-b002-4171-8c39-0707e219e168-kube-api-access-9zdvc\") pod \"manila-share-share1-0\" (UID: \"5bf54a05-b002-4171-8c39-0707e219e168\") " pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.177672 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbpc\" (UniqueName: \"kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc\") pod \"dnsmasq-dns-676f74989c-5mnw8\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.194037 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232361 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-scripts\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a736d712-c29c-4228-b878-46ab90132fe4-etc-machine-id\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232589 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232622 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data-custom\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232682 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st2w6\" (UniqueName: \"kubernetes.io/projected/a736d712-c29c-4228-b878-46ab90132fe4-kube-api-access-st2w6\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.232702 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a736d712-c29c-4228-b878-46ab90132fe4-logs\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.269837 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335294 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-scripts\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335364 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a736d712-c29c-4228-b878-46ab90132fe4-etc-machine-id\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335438 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335485 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data-custom\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335510 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st2w6\" (UniqueName: \"kubernetes.io/projected/a736d712-c29c-4228-b878-46ab90132fe4-kube-api-access-st2w6\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a736d712-c29c-4228-b878-46ab90132fe4-logs\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.335942 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a736d712-c29c-4228-b878-46ab90132fe4-logs\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.337121 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a736d712-c29c-4228-b878-46ab90132fe4-etc-machine-id\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.351883 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data-custom\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " 
pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.355633 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.360627 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-scripts\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.371580 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st2w6\" (UniqueName: \"kubernetes.io/projected/a736d712-c29c-4228-b878-46ab90132fe4-kube-api-access-st2w6\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.372286 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a736d712-c29c-4228-b878-46ab90132fe4-config-data\") pod \"manila-api-0\" (UID: \"a736d712-c29c-4228-b878-46ab90132fe4\") " pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.626621 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 24 12:46:25 crc kubenswrapper[4752]: I1124 12:46:25.805241 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.016976 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"] Nov 24 12:46:26 crc kubenswrapper[4752]: W1124 12:46:26.018884 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod183e03d9_f5f1_4596_a69e_aa1a417a2c78.slice/crio-be1ec49ed9a4e3b7b6fa7bd0c54d3c94d389960782d0a568d19dac9c0cb4c54f WatchSource:0}: Error finding container be1ec49ed9a4e3b7b6fa7bd0c54d3c94d389960782d0a568d19dac9c0cb4c54f: Status 404 returned error can't find the container with id be1ec49ed9a4e3b7b6fa7bd0c54d3c94d389960782d0a568d19dac9c0cb4c54f Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.132891 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.443713 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.494089 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f","Type":"ContainerStarted","Data":"a3cce96e53496766c4100cd00ee50a3e4176f0c952757ab17499d2bb9152ee8a"} Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.499410 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5bf54a05-b002-4171-8c39-0707e219e168","Type":"ContainerStarted","Data":"c3076927089440d1549aa883044233fe79759f94fb60cd8f5de446456d06c565"} Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.501867 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" 
event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerStarted","Data":"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"} Nov 24 12:46:26 crc kubenswrapper[4752]: I1124 12:46:26.501904 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerStarted","Data":"be1ec49ed9a4e3b7b6fa7bd0c54d3c94d389960782d0a568d19dac9c0cb4c54f"} Nov 24 12:46:27 crc kubenswrapper[4752]: I1124 12:46:27.525494 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f","Type":"ContainerStarted","Data":"f0a775d6256d950ac204b737b2cc6e5a9cc7f71898a05ae5687723e1fd871a89"} Nov 24 12:46:27 crc kubenswrapper[4752]: I1124 12:46:27.529515 4752 generic.go:334] "Generic (PLEG): container finished" podID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerID="277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b" exitCode=0 Nov 24 12:46:27 crc kubenswrapper[4752]: I1124 12:46:27.529574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerDied","Data":"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"} Nov 24 12:46:27 crc kubenswrapper[4752]: I1124 12:46:27.540976 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a736d712-c29c-4228-b878-46ab90132fe4","Type":"ContainerStarted","Data":"ae1df64f5a94d192f06a791a442e023f4d37d950b3717198a865df524a341701"} Nov 24 12:46:27 crc kubenswrapper[4752]: I1124 12:46:27.541029 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a736d712-c29c-4228-b878-46ab90132fe4","Type":"ContainerStarted","Data":"0475c53f22985ab1d8ce50eb7431e5355ef160da61f47da59cc4ef2472a0fe68"} Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.489548 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.561619 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerStarted","Data":"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"} Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.561685 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.570476 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a736d712-c29c-4228-b878-46ab90132fe4","Type":"ContainerStarted","Data":"ef01cb28674274c893858a8c4be3527aa0ee6cc025bd1aa7357a0d215b5895fa"} Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.570962 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.576239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d9e8e957-b8c5-40bc-bbb6-aa5800cac83f","Type":"ContainerStarted","Data":"e25effbdb4fc218928a16483b846735a7212bfdd4d3a87d7584106d941458fbf"} Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.594587 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" 
podStartSLOduration=4.594563889 podStartE2EDuration="4.594563889s" podCreationTimestamp="2025-11-24 12:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:46:28.583764029 +0000 UTC m=+5994.568584338" watchObservedRunningTime="2025-11-24 12:46:28.594563889 +0000 UTC m=+5994.579384178" Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.610666 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.896516645 podStartE2EDuration="4.610646301s" podCreationTimestamp="2025-11-24 12:46:24 +0000 UTC" firstStartedPulling="2025-11-24 12:46:25.811662335 +0000 UTC m=+5991.796482624" lastFinishedPulling="2025-11-24 12:46:26.525791991 +0000 UTC m=+5992.510612280" observedRunningTime="2025-11-24 12:46:28.605055331 +0000 UTC m=+5994.589875630" watchObservedRunningTime="2025-11-24 12:46:28.610646301 +0000 UTC m=+5994.595466590" Nov 24 12:46:28 crc kubenswrapper[4752]: I1124 12:46:28.641260 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.64123916 podStartE2EDuration="3.64123916s" podCreationTimestamp="2025-11-24 12:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:46:28.624254652 +0000 UTC m=+5994.609074951" watchObservedRunningTime="2025-11-24 12:46:28.64123916 +0000 UTC m=+5994.626059449" Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.018614 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.019218 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-central-agent" containerID="cri-o://45f427be77176c8c7480233ef78da5e83bacabd33e0f3969b88b0cb13e2f8db6" gracePeriod=30 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.019674 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="proxy-httpd" containerID="cri-o://86b178d77c4cc3790b2e8301dd07be220d333d5adb3cffc97e0acc39df6edbf5" gracePeriod=30 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.019719 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="sg-core" containerID="cri-o://c406a8a8a2f912e28a2f965f0a42f996eb33ae004074b356e037bde4cc8f8994" gracePeriod=30 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.019782 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-notification-agent" containerID="cri-o://7246ff097d33a175194c8681d68818325857b181214ca1f263740e63712b3e1d" gracePeriod=30 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.601618 4752 generic.go:334] "Generic (PLEG): container finished" podID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerID="86b178d77c4cc3790b2e8301dd07be220d333d5adb3cffc97e0acc39df6edbf5" exitCode=0 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.601946 4752 generic.go:334] "Generic (PLEG): container finished" podID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" 
containerID="c406a8a8a2f912e28a2f965f0a42f996eb33ae004074b356e037bde4cc8f8994" exitCode=2 Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.601697 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerDied","Data":"86b178d77c4cc3790b2e8301dd07be220d333d5adb3cffc97e0acc39df6edbf5"} Nov 24 12:46:30 crc kubenswrapper[4752]: I1124 12:46:30.601996 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerDied","Data":"c406a8a8a2f912e28a2f965f0a42f996eb33ae004074b356e037bde4cc8f8994"} Nov 24 12:46:31 crc kubenswrapper[4752]: I1124 12:46:31.618637 4752 generic.go:334] "Generic (PLEG): container finished" podID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerID="45f427be77176c8c7480233ef78da5e83bacabd33e0f3969b88b0cb13e2f8db6" exitCode=0 Nov 24 12:46:31 crc kubenswrapper[4752]: I1124 12:46:31.618719 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerDied","Data":"45f427be77176c8c7480233ef78da5e83bacabd33e0f3969b88b0cb13e2f8db6"} Nov 24 12:46:32 crc kubenswrapper[4752]: I1124 12:46:32.638215 4752 generic.go:334] "Generic (PLEG): container finished" podID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerID="7246ff097d33a175194c8681d68818325857b181214ca1f263740e63712b3e1d" exitCode=0 Nov 24 12:46:32 crc kubenswrapper[4752]: I1124 12:46:32.638292 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerDied","Data":"7246ff097d33a175194c8681d68818325857b181214ca1f263740e63712b3e1d"} Nov 24 12:46:33 crc kubenswrapper[4752]: I1124 12:46:33.967304 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079191 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079390 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079486 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079510 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079576 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079608 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079694 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data\") pod \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\" (UID: \"bcdd7e57-6834-4d6c-ac2e-7489bae07373\") " Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.079828 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.080106 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.080290 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.080304 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcdd7e57-6834-4d6c-ac2e-7489bae07373-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.084929 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts" (OuterVolumeSpecName: "scripts") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.085475 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr" (OuterVolumeSpecName: "kube-api-access-c8wrr") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "kube-api-access-c8wrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.141270 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.182860 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.182901 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.182915 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8wrr\" (UniqueName: \"kubernetes.io/projected/bcdd7e57-6834-4d6c-ac2e-7489bae07373-kube-api-access-c8wrr\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.251207 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.260990 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data" (OuterVolumeSpecName: "config-data") pod "bcdd7e57-6834-4d6c-ac2e-7489bae07373" (UID: "bcdd7e57-6834-4d6c-ac2e-7489bae07373"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.285118 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.285163 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcdd7e57-6834-4d6c-ac2e-7489bae07373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.675512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5bf54a05-b002-4171-8c39-0707e219e168","Type":"ContainerStarted","Data":"559bdd6d16b1095ab5dfc053749f8e6432b4e6b36c85c4a545d1f7ea2ea51d2a"} Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.676034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5bf54a05-b002-4171-8c39-0707e219e168","Type":"ContainerStarted","Data":"d0087125f2191d4a98b46626739154a7fee497bfa1c828947730e130f88f90a6"} Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.679190 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcdd7e57-6834-4d6c-ac2e-7489bae07373","Type":"ContainerDied","Data":"bf6963b7516244cf803227db21876fb39e56f8359e0e4bbeb6edb8aadeb10ab0"} Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.679236 4752 scope.go:117] "RemoveContainer" containerID="86b178d77c4cc3790b2e8301dd07be220d333d5adb3cffc97e0acc39df6edbf5" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.679441 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.712373 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.234898015 podStartE2EDuration="10.71235715s" podCreationTimestamp="2025-11-24 12:46:24 +0000 UTC" firstStartedPulling="2025-11-24 12:46:26.164157967 +0000 UTC m=+5992.148978256" lastFinishedPulling="2025-11-24 12:46:33.641617102 +0000 UTC m=+5999.626437391" observedRunningTime="2025-11-24 12:46:34.703023052 +0000 UTC m=+6000.687843341" watchObservedRunningTime="2025-11-24 12:46:34.71235715 +0000 UTC m=+6000.697177439" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.750425 4752 scope.go:117] "RemoveContainer" containerID="c406a8a8a2f912e28a2f965f0a42f996eb33ae004074b356e037bde4cc8f8994" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.781311 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.781414 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.781435 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:34 crc kubenswrapper[4752]: E1124 12:46:34.789017 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="sg-core" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789066 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="sg-core" Nov 24 12:46:34 crc kubenswrapper[4752]: E1124 12:46:34.789099 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-notification-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789108 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-notification-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: E1124 12:46:34.789126 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-central-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789134 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-central-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: E1124 12:46:34.789156 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="proxy-httpd" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789164 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="proxy-httpd" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789533 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-notification-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789559 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="ceilometer-central-agent" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.789572 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="sg-core" Nov 24 12:46:34 crc 
kubenswrapper[4752]: I1124 12:46:34.789599 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" containerName="proxy-httpd" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.791900 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.792013 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.800155 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.800220 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904104 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904185 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904366 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps64q\" (UniqueName: \"kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904434 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.904470 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.918030 4752 scope.go:117] "RemoveContainer" 
containerID="7246ff097d33a175194c8681d68818325857b181214ca1f263740e63712b3e1d" Nov 24 12:46:34 crc kubenswrapper[4752]: I1124 12:46:34.976101 4752 scope.go:117] "RemoveContainer" containerID="45f427be77176c8c7480233ef78da5e83bacabd33e0f3969b88b0cb13e2f8db6" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.007951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008647 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008698 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008731 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.008948 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps64q\" (UniqueName: \"kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.011602 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.014526 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.019395 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.019961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.021650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.043061 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps64q\" (UniqueName: \"kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.044210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data\") pod \"ceilometer-0\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.149174 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.161983 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.195502 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.275329 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.374489 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"] Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.375071 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="dnsmasq-dns" containerID="cri-o://ab8869d564759e9fc6fcf1fbbfec19c6b2e4acce0b1acbff0de373405c7023fa" gracePeriod=10 Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.699340 4752 generic.go:334] "Generic (PLEG): container finished" podID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerID="ab8869d564759e9fc6fcf1fbbfec19c6b2e4acce0b1acbff0de373405c7023fa" exitCode=0 Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.699978 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" event={"ID":"e0a1d649-16e0-474e-ae26-2c0f17f2ef74","Type":"ContainerDied","Data":"ab8869d564759e9fc6fcf1fbbfec19c6b2e4acce0b1acbff0de373405c7023fa"} Nov 24 12:46:35 crc kubenswrapper[4752]: W1124 12:46:35.723951 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b3f8aa5_321a_4103_8f5d_59ff951c103a.slice/crio-b77c6aa3291be1a8bec4220cda2714dff002042cf5fa054929b8dd8357ccf365 WatchSource:0}: Error finding container b77c6aa3291be1a8bec4220cda2714dff002042cf5fa054929b8dd8357ccf365: Status 404 returned error can't find the container with id b77c6aa3291be1a8bec4220cda2714dff002042cf5fa054929b8dd8357ccf365 Nov 24 12:46:35 crc kubenswrapper[4752]: I1124 12:46:35.727400 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.108132 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.146423 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc\") pod \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.146486 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb\") pod \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.146680 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7nxd\" (UniqueName: \"kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd\") pod \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.146783 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb\") pod \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.146879 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config\") pod \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\" (UID: \"e0a1d649-16e0-474e-ae26-2c0f17f2ef74\") " Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.153179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd" (OuterVolumeSpecName: "kube-api-access-h7nxd") pod "e0a1d649-16e0-474e-ae26-2c0f17f2ef74" (UID: "e0a1d649-16e0-474e-ae26-2c0f17f2ef74"). InnerVolumeSpecName "kube-api-access-h7nxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.234327 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0a1d649-16e0-474e-ae26-2c0f17f2ef74" (UID: "e0a1d649-16e0-474e-ae26-2c0f17f2ef74"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.235491 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0a1d649-16e0-474e-ae26-2c0f17f2ef74" (UID: "e0a1d649-16e0-474e-ae26-2c0f17f2ef74"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.249031 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.249091 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7nxd\" (UniqueName: \"kubernetes.io/projected/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-kube-api-access-h7nxd\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.249099 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.257794 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config" (OuterVolumeSpecName: "config") pod "e0a1d649-16e0-474e-ae26-2c0f17f2ef74" (UID: "e0a1d649-16e0-474e-ae26-2c0f17f2ef74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.261283 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0a1d649-16e0-474e-ae26-2c0f17f2ef74" (UID: "e0a1d649-16e0-474e-ae26-2c0f17f2ef74"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.350905 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.351205 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a1d649-16e0-474e-ae26-2c0f17f2ef74-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.710932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerStarted","Data":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.710988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerStarted","Data":"b77c6aa3291be1a8bec4220cda2714dff002042cf5fa054929b8dd8357ccf365"} Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.714419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" event={"ID":"e0a1d649-16e0-474e-ae26-2c0f17f2ef74","Type":"ContainerDied","Data":"3effb433d190370929b7ef4b23f91d6a6356e0d4fc65b0cfb98bd45919a5f5ca"} Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.714462 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99f9cb5-xl88k" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.714490 4752 scope.go:117] "RemoveContainer" containerID="ab8869d564759e9fc6fcf1fbbfec19c6b2e4acce0b1acbff0de373405c7023fa" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.767163 4752 scope.go:117] "RemoveContainer" containerID="cca1b550908ef80250525dc8f0ed13ea8ec3d7f84c7e38e339ed6318e9cccca7" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.780107 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcdd7e57-6834-4d6c-ac2e-7489bae07373" path="/var/lib/kubelet/pods/bcdd7e57-6834-4d6c-ac2e-7489bae07373/volumes" Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.788455 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"] Nov 24 12:46:36 crc kubenswrapper[4752]: I1124 12:46:36.788505 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fd99f9cb5-xl88k"] Nov 24 12:46:37 crc kubenswrapper[4752]: I1124 12:46:37.730715 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerStarted","Data":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} Nov 24 12:46:38 crc kubenswrapper[4752]: I1124 12:46:38.745879 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" path="/var/lib/kubelet/pods/e0a1d649-16e0-474e-ae26-2c0f17f2ef74/volumes" Nov 24 12:46:38 crc kubenswrapper[4752]: I1124 12:46:38.749660 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerStarted","Data":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} Nov 24 12:46:39 crc kubenswrapper[4752]: I1124 12:46:39.563164 4752 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.786526 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerStarted","Data":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.787235 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.787265 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="proxy-httpd" containerID="cri-o://6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" gracePeriod=30 Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.787346 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="sg-core" containerID="cri-o://272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" gracePeriod=30 Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.787391 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-notification-agent" containerID="cri-o://862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" gracePeriod=30 Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.786855 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-central-agent" containerID="cri-o://0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" gracePeriod=30 Nov 24 12:46:41 crc kubenswrapper[4752]: I1124 12:46:41.820914 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.956197955 podStartE2EDuration="7.820890261s" podCreationTimestamp="2025-11-24 12:46:34 +0000 UTC" firstStartedPulling="2025-11-24 12:46:35.727522502 +0000 UTC m=+6001.712342791" lastFinishedPulling="2025-11-24 12:46:40.592214808 +0000 UTC m=+6006.577035097" observedRunningTime="2025-11-24 12:46:41.813037616 +0000 UTC m=+6007.797857925" watchObservedRunningTime="2025-11-24 12:46:41.820890261 +0000 UTC m=+6007.805710560" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.697650 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799394 4752 generic.go:334] "Generic (PLEG): container finished" podID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" exitCode=0 Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799424 4752 generic.go:334] "Generic (PLEG): container finished" podID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" exitCode=2 Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799430 4752 generic.go:334] "Generic (PLEG): container finished" podID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" exitCode=0 Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799436 4752 generic.go:334] "Generic (PLEG): container finished" podID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" exitCode=0 Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799458 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerDied","Data":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799497 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerDied","Data":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799514 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerDied","Data":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799526 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerDied","Data":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799529 4752 scope.go:117] "RemoveContainer" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799558 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.799549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b3f8aa5-321a-4103-8f5d-59ff951c103a","Type":"ContainerDied","Data":"b77c6aa3291be1a8bec4220cda2714dff002042cf5fa054929b8dd8357ccf365"} Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.815911 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps64q\" (UniqueName: \"kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.815998 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816044 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816090 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816122 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816180 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816225 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts\") pod \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\" (UID: \"0b3f8aa5-321a-4103-8f5d-59ff951c103a\") " Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816663 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.816929 4752 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.818125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.828276 4752 scope.go:117] "RemoveContainer" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.838493 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts" (OuterVolumeSpecName: "scripts") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.839387 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q" (OuterVolumeSpecName: "kube-api-access-ps64q") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "kube-api-access-ps64q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.860178 4752 scope.go:117] "RemoveContainer" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.865514 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.889583 4752 scope.go:117] "RemoveContainer" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.920443 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps64q\" (UniqueName: \"kubernetes.io/projected/0b3f8aa5-321a-4103-8f5d-59ff951c103a-kube-api-access-ps64q\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.920715 4752 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b3f8aa5-321a-4103-8f5d-59ff951c103a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.920803 4752 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.920880 4752 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.922727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:42 crc kubenswrapper[4752]: I1124 12:46:42.940916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data" (OuterVolumeSpecName: "config-data") pod "0b3f8aa5-321a-4103-8f5d-59ff951c103a" (UID: "0b3f8aa5-321a-4103-8f5d-59ff951c103a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.023123 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.023333 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b3f8aa5-321a-4103-8f5d-59ff951c103a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.032224 4752 scope.go:117] "RemoveContainer" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.033173 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": container with ID starting with 6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604 not found: ID does not exist" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.033486 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} err="failed to get container status \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": rpc error: code = NotFound desc = could not find container \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": container with ID starting with 6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.033663 4752 scope.go:117] "RemoveContainer" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.034135 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": container with ID starting with 272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221 not found: ID does not exist" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.034168 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} err="failed to get container status \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": rpc error: code = NotFound desc = could not find container \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": container with ID starting with 272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.034188 4752 scope.go:117] "RemoveContainer" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.034511 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": container with ID starting with 
862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a not found: ID does not exist" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.034549 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} err="failed to get container status \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": rpc error: code = NotFound desc = could not find container \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": container with ID starting with 862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.034577 4752 scope.go:117] "RemoveContainer" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.034951 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": container with ID starting with 0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d not found: ID does not exist" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.035125 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} err="failed to get container status \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": rpc error: code = NotFound desc = could not find container \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": container with ID starting with 0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.035372 4752 scope.go:117] "RemoveContainer" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.036168 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} err="failed to get container status \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": rpc error: code = NotFound desc = could not find container \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": container with ID starting with 6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.036191 4752 scope.go:117] "RemoveContainer" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.036597 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} err="failed to get container status \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": rpc error: code = NotFound desc = could not find container \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": container with ID starting with 272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221 not found: ID does not exist" Nov 24 
12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.036617 4752 scope.go:117] "RemoveContainer" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.036999 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} err="failed to get container status \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": rpc error: code = NotFound desc = could not find container \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": container with ID starting with 862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.037088 4752 scope.go:117] "RemoveContainer" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.037523 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} err="failed to get container status \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": rpc error: code = NotFound desc = could not find container \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": container with ID starting with 0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.037679 4752 scope.go:117] "RemoveContainer" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.037961 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} err="failed to get container status \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": rpc error: code = NotFound desc = could not find container \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": container with ID starting with 6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.037981 4752 scope.go:117] "RemoveContainer" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.038244 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} err="failed to get container status \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": rpc error: code = NotFound desc = could not find container \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": container with ID starting with 272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.038272 4752 scope.go:117] "RemoveContainer" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.038762 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} err="failed to get container status 
\"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": rpc error: code = NotFound desc = could not find container \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": container with ID starting with 862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.038786 4752 scope.go:117] "RemoveContainer" containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.039243 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} err="failed to get container status \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": rpc error: code = NotFound desc = could not find container \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": container with ID starting with 0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.039362 4752 scope.go:117] "RemoveContainer" containerID="6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.039953 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604"} err="failed to get container status \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": rpc error: code = NotFound desc = could not find container \"6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604\": container with ID starting with 6f5480af53f22ff33fd27baaa27755727b2b55a6dbd6735abf2f3ab85daf2604 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.040040 4752 scope.go:117] "RemoveContainer" containerID="272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.040445 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221"} err="failed to get container status \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": rpc error: code = NotFound desc = could not find container \"272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221\": container with ID starting with 272f0ade6feb1b000f2c77346f98b0331d90931da06b33a3d888b9d3a4891221 not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.040579 4752 scope.go:117] "RemoveContainer" containerID="862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.041025 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a"} err="failed to get container status \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": rpc error: code = NotFound desc = could not find container \"862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a\": container with ID starting with 862a8058ab20b86d4bca603837bff44b2e68008b6311e399b36ec23e95062f0a not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.041047 4752 scope.go:117] "RemoveContainer" 
containerID="0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.041260 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d"} err="failed to get container status \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": rpc error: code = NotFound desc = could not find container \"0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d\": container with ID starting with 0c50385e235fe7ba7087bf3178c3f41d104e68b45589afe10616d24d5713597d not found: ID does not exist" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.145363 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.156145 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.191736 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.192637 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="init" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.192730 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="init" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.192851 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="sg-core" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.192927 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="sg-core" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.193010 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-notification-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193077 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-notification-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.193158 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-central-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193220 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-central-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.193293 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="dnsmasq-dns" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193363 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="dnsmasq-dns" Nov 24 12:46:43 crc kubenswrapper[4752]: E1124 12:46:43.193459 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="proxy-httpd" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193530 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="proxy-httpd" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193883 4752 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a1d649-16e0-474e-ae26-2c0f17f2ef74" containerName="dnsmasq-dns" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.193976 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-central-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.194063 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="proxy-httpd" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.194162 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="ceilometer-notification-agent" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.194231 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" containerName="sg-core" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.196828 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.202523 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.202635 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.205621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-run-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226809 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-config-data\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-scripts\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226976 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.226999 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq57q\" (UniqueName: \"kubernetes.io/projected/87748d6d-7490-4a95-9cdc-5fc516929b3d-kube-api-access-mq57q\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328659 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328735 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328771 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq57q\" (UniqueName: \"kubernetes.io/projected/87748d6d-7490-4a95-9cdc-5fc516929b3d-kube-api-access-mq57q\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328833 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-run-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-config-data\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328920 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-scripts\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.328947 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.329499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-run-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.329563 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87748d6d-7490-4a95-9cdc-5fc516929b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.335119 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.336069 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.344954 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-config-data\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.354154 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq57q\" (UniqueName: \"kubernetes.io/projected/87748d6d-7490-4a95-9cdc-5fc516929b3d-kube-api-access-mq57q\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.356572 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87748d6d-7490-4a95-9cdc-5fc516929b3d-scripts\") pod \"ceilometer-0\" (UID: \"87748d6d-7490-4a95-9cdc-5fc516929b3d\") " pod="openstack/ceilometer-0" Nov 24 12:46:43 crc kubenswrapper[4752]: I1124 12:46:43.521847 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 12:46:44 crc kubenswrapper[4752]: I1124 12:46:44.111357 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 24 12:46:44 crc kubenswrapper[4752]: I1124 12:46:44.742710 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3f8aa5-321a-4103-8f5d-59ff951c103a" path="/var/lib/kubelet/pods/0b3f8aa5-321a-4103-8f5d-59ff951c103a/volumes"
Nov 24 12:46:44 crc kubenswrapper[4752]: I1124 12:46:44.834893 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87748d6d-7490-4a95-9cdc-5fc516929b3d","Type":"ContainerStarted","Data":"6d45a1d506100abc95398c44af5294bd31f4e3119ac98d0385b84f6dbadf7626"}
Nov 24 12:46:45 crc kubenswrapper[4752]: I1124 12:46:45.881278 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87748d6d-7490-4a95-9cdc-5fc516929b3d","Type":"ContainerStarted","Data":"c30b7caca9e182ff141c87114a7c382f75aaf3d42e19484db399d62018fbf902"}
Nov 24 12:46:46 crc kubenswrapper[4752]: I1124 12:46:46.293032 4752 scope.go:117] "RemoveContainer" containerID="a4b317cccf9b069d669d79631708f878dfceb62d2f2df166276b3ddae208c6c9"
Nov 24 12:46:46 crc kubenswrapper[4752]: I1124 12:46:46.341578 4752 scope.go:117] "RemoveContainer" containerID="75824353830c05a0521eb1cb0c25fc522e3e8198813f7e10876fe6c76ae15e02"
Nov 24 12:46:46 crc kubenswrapper[4752]: I1124 12:46:46.505700 4752 scope.go:117] "RemoveContainer" containerID="fcf86d523ed08db04b505e8ec59028ab8a5a646c2138613945af49f0b4e19c6b"
Nov 24 12:46:46 crc kubenswrapper[4752]: I1124 12:46:46.862617 4752 scope.go:117] "RemoveContainer" containerID="cd6925524a88bfd66414e81ae9a1567d3985dfbad3b896dfb9fa0bda4ec33241"
Nov 24 12:46:46 crc kubenswrapper[4752]: I1124 12:46:46.938963 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Nov 24 12:46:47 crc kubenswrapper[4752]: I1124 12:46:47.208815 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Nov 24 12:46:47 crc kubenswrapper[4752]: I1124 12:46:47.419253 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Nov 24 12:46:47 crc kubenswrapper[4752]: I1124 12:46:47.920579 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87748d6d-7490-4a95-9cdc-5fc516929b3d","Type":"ContainerStarted","Data":"d9be728800e2513628e1445bc135856431db4827da55994e91f954cdab7d22bd"}
Nov 24 12:46:48 crc kubenswrapper[4752]: I1124 12:46:48.934299 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87748d6d-7490-4a95-9cdc-5fc516929b3d","Type":"ContainerStarted","Data":"437de17f5eed897fd0e91dec2d0b930fb1c2422bf3668aea7bbe89bf61113e68"}
Nov 24 12:46:49 crc kubenswrapper[4752]: I1124 12:46:49.949232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87748d6d-7490-4a95-9cdc-5fc516929b3d","Type":"ContainerStarted","Data":"73a8e087514f934533d0e54ccd101a6609dc9ff45afb1acd326b79fb14cbe2cd"}
Nov 24 12:46:49 crc kubenswrapper[4752]: I1124 12:46:49.949566 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 24 12:46:49 crc kubenswrapper[4752]: I1124 12:46:49.977560 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.744821247 podStartE2EDuration="6.977542171s" podCreationTimestamp="2025-11-24 12:46:43 +0000 UTC" firstStartedPulling="2025-11-24 12:46:44.102968195 +0000 UTC m=+6010.087788484" lastFinishedPulling="2025-11-24 12:46:49.335689119 +0000 UTC m=+6015.320509408" observedRunningTime="2025-11-24 12:46:49.975483862 +0000 UTC m=+6015.960304151" watchObservedRunningTime="2025-11-24 12:46:49.977542171 +0000 UTC m=+6015.962362460"
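
The "Observed pod startup duration" entry above is worth decoding, because its three durations are mutually consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (12:46:49.977542171 - 12:46:43 = 6.977542171s), and podStartSLOduration additionally subtracts the image-pull window, 12:46:49.335689119 - 12:46:44.102968195 = 5.232720924s, leaving 1.744821247s. A short Go check of that arithmetic using the timestamps exactly as logged (the layout string is Go's default time.Time print format):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Recomputes the ceilometer-0 startup latency figures from the log line.
    // podStartSLOduration excludes time spent pulling images, so it is the
    // end-to-end duration minus (lastFinishedPulling - firstStartedPulling).
    func main() {
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	parse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2025-11-24 12:46:43 +0000 UTC")
    	firstPull := parse("2025-11-24 12:46:44.102968195 +0000 UTC")
    	lastPull := parse("2025-11-24 12:46:49.335689119 +0000 UTC")
    	running := parse("2025-11-24 12:46:49.977542171 +0000 UTC")

    	e2e := running.Sub(created)          // 6.977542171s, matches podStartE2EDuration
    	slo := e2e - lastPull.Sub(firstPull) // 1.744821247s, matches podStartSLOduration
    	fmt.Println(e2e, slo)
    }
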
podStartE2EDuration="6.977542171s" podCreationTimestamp="2025-11-24 12:46:43 +0000 UTC" firstStartedPulling="2025-11-24 12:46:44.102968195 +0000 UTC m=+6010.087788484" lastFinishedPulling="2025-11-24 12:46:49.335689119 +0000 UTC m=+6015.320509408" observedRunningTime="2025-11-24 12:46:49.975483862 +0000 UTC m=+6015.960304151" watchObservedRunningTime="2025-11-24 12:46:49.977542171 +0000 UTC m=+6015.962362460" Nov 24 12:46:55 crc kubenswrapper[4752]: I1124 12:46:55.042005 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-r4mbw"] Nov 24 12:46:55 crc kubenswrapper[4752]: I1124 12:46:55.059312 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-483d-account-create-6rq5c"] Nov 24 12:46:55 crc kubenswrapper[4752]: I1124 12:46:55.069571 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-483d-account-create-6rq5c"] Nov 24 12:46:55 crc kubenswrapper[4752]: I1124 12:46:55.079533 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-r4mbw"] Nov 24 12:46:56 crc kubenswrapper[4752]: I1124 12:46:56.758609 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21eb33dc-8ff4-486c-94e2-6bb575ae96b1" path="/var/lib/kubelet/pods/21eb33dc-8ff4-486c-94e2-6bb575ae96b1/volumes" Nov 24 12:46:56 crc kubenswrapper[4752]: I1124 12:46:56.759784 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5574bf-0917-4afa-b978-788f73ad765f" path="/var/lib/kubelet/pods/df5574bf-0917-4afa-b978-788f73ad765f/volumes" Nov 24 12:47:03 crc kubenswrapper[4752]: I1124 12:47:03.034051 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wtnw8"] Nov 24 12:47:03 crc kubenswrapper[4752]: I1124 12:47:03.043213 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wtnw8"] Nov 24 12:47:04 crc kubenswrapper[4752]: I1124 12:47:04.751134 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150b6dd6-25f2-4854-8f0c-2088c5db37b2" path="/var/lib/kubelet/pods/150b6dd6-25f2-4854-8f0c-2088c5db37b2/volumes" Nov 24 12:47:13 crc kubenswrapper[4752]: I1124 12:47:13.530169 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.156110 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.158714 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.160648 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.175691 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.222810 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmql\" (UniqueName: \"kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.222867 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.223243 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.223374 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.223464 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.223527 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325558 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325630 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " 
pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325671 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325700 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmql\" (UniqueName: \"kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.325808 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.326888 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.326916 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.326944 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.326946 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.326916 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.343470 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nsmql\" (UniqueName: \"kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql\") pod \"dnsmasq-dns-7467c5dd9-jfq79\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:34 crc kubenswrapper[4752]: I1124 12:47:34.480538 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:35 crc kubenswrapper[4752]: I1124 12:47:35.000318 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:35 crc kubenswrapper[4752]: I1124 12:47:35.545940 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerStarted","Data":"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9"} Nov 24 12:47:35 crc kubenswrapper[4752]: I1124 12:47:35.546329 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerStarted","Data":"9667a150f13eed4a0c453da24909dc19b74a5e37eade62212709c1d9fed943f0"} Nov 24 12:47:36 crc kubenswrapper[4752]: I1124 12:47:36.557928 4752 generic.go:334] "Generic (PLEG): container finished" podID="7caac067-8c0c-485c-98ff-3021fd720674" containerID="a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9" exitCode=0 Nov 24 12:47:36 crc kubenswrapper[4752]: I1124 12:47:36.558050 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerDied","Data":"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9"} Nov 24 12:47:37 crc kubenswrapper[4752]: I1124 12:47:37.576991 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerStarted","Data":"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9"} Nov 24 12:47:37 crc kubenswrapper[4752]: I1124 12:47:37.577657 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:37 crc kubenswrapper[4752]: I1124 12:47:37.611628 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" podStartSLOduration=3.611600358 podStartE2EDuration="3.611600358s" podCreationTimestamp="2025-11-24 12:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:47:37.595059203 +0000 UTC m=+6063.579879502" watchObservedRunningTime="2025-11-24 12:47:37.611600358 +0000 UTC m=+6063.596420657" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.481949 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.589224 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"] Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.589470 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="dnsmasq-dns" 
containerID="cri-o://7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439" gracePeriod=10 Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.766140 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-kbnrm"] Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.769923 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.782218 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-kbnrm"] Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899596 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899820 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-config\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:44 crc kubenswrapper[4752]: I1124 12:47:44.899991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plm9z\" (UniqueName: \"kubernetes.io/projected/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-kube-api-access-plm9z\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006281 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plm9z\" (UniqueName: \"kubernetes.io/projected/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-kube-api-access-plm9z\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006628 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006731 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006779 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.006809 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-config\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.007626 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-config\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.008783 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-nb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.009301 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-openstack-cell1\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.009833 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-dns-svc\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.010259 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-ovsdbserver-sb\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.045772 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plm9z\" (UniqueName: \"kubernetes.io/projected/8729d4c7-0edc-4f70-a988-ffb7c2a265ca-kube-api-access-plm9z\") pod \"dnsmasq-dns-5cdb84b55c-kbnrm\" (UID: \"8729d4c7-0edc-4f70-a988-ffb7c2a265ca\") " pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.114365 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.269340 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.314618 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config\") pod \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.314723 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fbpc\" (UniqueName: \"kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc\") pod \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.314798 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb\") pod \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.315010 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb\") pod \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.315047 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc\") pod \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\" (UID: \"183e03d9-f5f1-4596-a69e-aa1a417a2c78\") " Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.331126 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc" (OuterVolumeSpecName: "kube-api-access-2fbpc") pod "183e03d9-f5f1-4596-a69e-aa1a417a2c78" (UID: "183e03d9-f5f1-4596-a69e-aa1a417a2c78"). InnerVolumeSpecName "kube-api-access-2fbpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.399419 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "183e03d9-f5f1-4596-a69e-aa1a417a2c78" (UID: "183e03d9-f5f1-4596-a69e-aa1a417a2c78"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.399447 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "183e03d9-f5f1-4596-a69e-aa1a417a2c78" (UID: "183e03d9-f5f1-4596-a69e-aa1a417a2c78"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.418374 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fbpc\" (UniqueName: \"kubernetes.io/projected/183e03d9-f5f1-4596-a69e-aa1a417a2c78-kube-api-access-2fbpc\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.418405 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.418464 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.426558 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config" (OuterVolumeSpecName: "config") pod "183e03d9-f5f1-4596-a69e-aa1a417a2c78" (UID: "183e03d9-f5f1-4596-a69e-aa1a417a2c78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.433443 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "183e03d9-f5f1-4596-a69e-aa1a417a2c78" (UID: "183e03d9-f5f1-4596-a69e-aa1a417a2c78"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.520668 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.520712 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183e03d9-f5f1-4596-a69e-aa1a417a2c78-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.665148 4752 generic.go:334] "Generic (PLEG): container finished" podID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerID="7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439" exitCode=0 Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.665214 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-676f74989c-5mnw8"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.665228 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerDied","Data":"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"}
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.666365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-676f74989c-5mnw8" event={"ID":"183e03d9-f5f1-4596-a69e-aa1a417a2c78","Type":"ContainerDied","Data":"be1ec49ed9a4e3b7b6fa7bd0c54d3c94d389960782d0a568d19dac9c0cb4c54f"}
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.666402 4752 scope.go:117] "RemoveContainer" containerID="7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.698835 4752 scope.go:117] "RemoveContainer" containerID="277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.717860 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"]
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.729687 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-676f74989c-5mnw8"]
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.739108 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cdb84b55c-kbnrm"]
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.983035 4752 scope.go:117] "RemoveContainer" containerID="7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"
Nov 24 12:47:45 crc kubenswrapper[4752]: E1124 12:47:45.984006 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439\": container with ID starting with 7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439 not found: ID does not exist" containerID="7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.984035 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439"} err="failed to get container status \"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439\": rpc error: code = NotFound desc = could not find container \"7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439\": container with ID starting with 7065dc6142da639dafea9e4651f6ca4eb29ff2f42534d550c3006ee943817439 not found: ID does not exist"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.984060 4752 scope.go:117] "RemoveContainer" containerID="277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"
Nov 24 12:47:45 crc kubenswrapper[4752]: E1124 12:47:45.984346 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b\": container with ID starting with 277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b not found: ID does not exist" containerID="277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"
Nov 24 12:47:45 crc kubenswrapper[4752]: I1124 12:47:45.984366 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"} err="failed to get container status \"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b\": rpc error: code = NotFound desc = could not find container \"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b\": container with ID starting with 277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b not found: ID does not exist"
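
The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" NotFound lines above (and the identical pair for the old ceilometer container at 12:46:43.041260) are benign despite the E-level entry: the kubelet is removing containers that CRI-O has already forgotten, and a NotFound answer means there is nothing left to do. A sketch of that idempotent delete, assuming gRPC status codes; removeContainer and the remove callback are illustrative, not the kubelet's actual signatures:

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeContainer sketches the idempotent delete behind these log lines:
    // if the runtime answers NotFound, the container is already gone, so the
    // error is logged and then swallowed instead of retried. The remove
    // callback is an assumed stand-in for the CRI RemoveContainer RPC.
    func removeContainer(id string, remove func(string) error) error {
    	err := remove(id)
    	if status.Code(err) == codes.NotFound {
    		fmt.Printf("container %q already removed; treating NotFound as success\n", id)
    		return nil
    	}
    	return err // nil on success, real failures propagate
    }

    func main() {
    	// Simulate CRI-O's answer for a container deleted moments earlier.
    	remove := func(id string) error {
    		return status.Error(codes.NotFound, "could not find container "+id)
    	}
    	if err := removeContainer("7065dc6142da...", remove); err != nil {
    		fmt.Println("unexpected:", err)
    	}
    }
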
containerID={"Type":"cri-o","ID":"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b"} err="failed to get container status \"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b\": rpc error: code = NotFound desc = could not find container \"277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b\": container with ID starting with 277ab1dcd3680267507bae6a48d31c0d989301d2bf9e198c1382f0b854de056b not found: ID does not exist" Nov 24 12:47:46 crc kubenswrapper[4752]: I1124 12:47:46.687845 4752 generic.go:334] "Generic (PLEG): container finished" podID="8729d4c7-0edc-4f70-a988-ffb7c2a265ca" containerID="e2c6d788b03c85b3d6da6a1f6f6d50a9bf22ef7315bcf4c7a593ce22535b51bb" exitCode=0 Nov 24 12:47:46 crc kubenswrapper[4752]: I1124 12:47:46.688242 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" event={"ID":"8729d4c7-0edc-4f70-a988-ffb7c2a265ca","Type":"ContainerDied","Data":"e2c6d788b03c85b3d6da6a1f6f6d50a9bf22ef7315bcf4c7a593ce22535b51bb"} Nov 24 12:47:46 crc kubenswrapper[4752]: I1124 12:47:46.688287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" event={"ID":"8729d4c7-0edc-4f70-a988-ffb7c2a265ca","Type":"ContainerStarted","Data":"7fc77e05c745fc32d7a3ad1d8e04d698300f795183ad758d39306da0c05df57b"} Nov 24 12:47:46 crc kubenswrapper[4752]: I1124 12:47:46.759608 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" path="/var/lib/kubelet/pods/183e03d9-f5f1-4596-a69e-aa1a417a2c78/volumes" Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.406171 4752 scope.go:117] "RemoveContainer" containerID="7ed569668bcc9bfb02226bffc809ab3ba71d247d314b7301cba9f30bcad94d0e" Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.444819 4752 scope.go:117] "RemoveContainer" containerID="17e2f74d0ed3b38d56f78d5df5499ca44082fa9de130f96a5c55a03a87e949e4" Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.491601 4752 scope.go:117] "RemoveContainer" containerID="1ddfdd3eaa014a677e29890ad9b7ba6b478751bd11e918a04e2b3f15ed57f85e" Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.714788 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" event={"ID":"8729d4c7-0edc-4f70-a988-ffb7c2a265ca","Type":"ContainerStarted","Data":"4d008753e533977184257a1d404732a5c7bfe2bc8b0f773d6baf0751366a347c"} Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.715100 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:47 crc kubenswrapper[4752]: I1124 12:47:47.748079 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" podStartSLOduration=3.74805789 podStartE2EDuration="3.74805789s" podCreationTimestamp="2025-11-24 12:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 12:47:47.73934106 +0000 UTC m=+6073.724161359" watchObservedRunningTime="2025-11-24 12:47:47.74805789 +0000 UTC m=+6073.732878169" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.117023 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cdb84b55c-kbnrm" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.193345 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:55 
crc kubenswrapper[4752]: I1124 12:47:55.193611 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="dnsmasq-dns" containerID="cri-o://535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9" gracePeriod=10 Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.733940 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.777571 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.777934 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.778004 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.778043 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.778149 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.778193 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmql\" (UniqueName: \"kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql\") pod \"7caac067-8c0c-485c-98ff-3021fd720674\" (UID: \"7caac067-8c0c-485c-98ff-3021fd720674\") " Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.793362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql" (OuterVolumeSpecName: "kube-api-access-nsmql") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "kube-api-access-nsmql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.805430 4752 generic.go:334] "Generic (PLEG): container finished" podID="7caac067-8c0c-485c-98ff-3021fd720674" containerID="535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9" exitCode=0 Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.805507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerDied","Data":"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9"} Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.805555 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" event={"ID":"7caac067-8c0c-485c-98ff-3021fd720674","Type":"ContainerDied","Data":"9667a150f13eed4a0c453da24909dc19b74a5e37eade62212709c1d9fed943f0"} Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.805583 4752 scope.go:117] "RemoveContainer" containerID="535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.805831 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7467c5dd9-jfq79" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.842944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.842954 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.866583 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.869479 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config" (OuterVolumeSpecName: "config") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.874576 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "7caac067-8c0c-485c-98ff-3021fd720674" (UID: "7caac067-8c0c-485c-98ff-3021fd720674"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882558 4752 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882598 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882612 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-config\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882631 4752 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882643 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsmql\" (UniqueName: \"kubernetes.io/projected/7caac067-8c0c-485c-98ff-3021fd720674-kube-api-access-nsmql\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.882656 4752 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7caac067-8c0c-485c-98ff-3021fd720674-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.922095 4752 scope.go:117] "RemoveContainer" containerID="a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.953795 4752 scope.go:117] "RemoveContainer" containerID="535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9" Nov 24 12:47:55 crc kubenswrapper[4752]: E1124 12:47:55.954585 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9\": container with ID starting with 535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9 not found: ID does not exist" containerID="535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.954629 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9"} err="failed to get container status \"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9\": rpc error: code = NotFound desc = could not find container \"535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9\": container with ID starting with 535a96444c40290cd05c4ba7e7c1e04a1062498688a3629dda3d7e7156347cc9 not found: ID does not exist" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.954669 4752 scope.go:117] "RemoveContainer" containerID="a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9" Nov 24 12:47:55 crc kubenswrapper[4752]: E1124 12:47:55.955007 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9\": container with ID starting with 
a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9 not found: ID does not exist" containerID="a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9" Nov 24 12:47:55 crc kubenswrapper[4752]: I1124 12:47:55.955080 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9"} err="failed to get container status \"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9\": rpc error: code = NotFound desc = could not find container \"a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9\": container with ID starting with a5371446e4e7d055db80481450b1f286ddc20e7af08188bf9dd6c45759e492d9 not found: ID does not exist" Nov 24 12:47:56 crc kubenswrapper[4752]: I1124 12:47:56.144936 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:56 crc kubenswrapper[4752]: I1124 12:47:56.154234 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7467c5dd9-jfq79"] Nov 24 12:47:56 crc kubenswrapper[4752]: I1124 12:47:56.740432 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7caac067-8c0c-485c-98ff-3021fd720674" path="/var/lib/kubelet/pods/7caac067-8c0c-485c-98ff-3021fd720674/volumes" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.918111 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp"] Nov 24 12:48:05 crc kubenswrapper[4752]: E1124 12:48:05.922597 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="init" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922627 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="init" Nov 24 12:48:05 crc kubenswrapper[4752]: E1124 12:48:05.922655 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="init" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922664 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="init" Nov 24 12:48:05 crc kubenswrapper[4752]: E1124 12:48:05.922686 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922692 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: E1124 12:48:05.922717 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922722 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922945 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7caac067-8c0c-485c-98ff-3021fd720674" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.922971 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="183e03d9-f5f1-4596-a69e-aa1a417a2c78" containerName="dnsmasq-dns" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.923731 4752 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.943635 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.943795 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.944225 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.944340 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 12:48:05 crc kubenswrapper[4752]: I1124 12:48:05.978412 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp"] Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.029781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.030115 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.030148 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.030178 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljf6d\" (UniqueName: \"kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.030252 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.133330 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.133729 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.133909 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.133943 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.133976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljf6d\" (UniqueName: \"kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.145617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.145700 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.147436 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.152980 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljf6d\" (UniqueName: \"kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.168581 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.253485 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.887224 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp"] Nov 24 12:48:06 crc kubenswrapper[4752]: I1124 12:48:06.966078 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" event={"ID":"0e4de15b-2388-46ea-a966-8471f04cd894","Type":"ContainerStarted","Data":"563a860bd81acb09688899dfdf5bec992b6a1fb48db8844249e2642821075809"} Nov 24 12:48:15 crc kubenswrapper[4752]: I1124 12:48:15.469158 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:48:15 crc kubenswrapper[4752]: I1124 12:48:15.469816 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:48:16 crc kubenswrapper[4752]: I1124 12:48:16.078941 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" event={"ID":"0e4de15b-2388-46ea-a966-8471f04cd894","Type":"ContainerStarted","Data":"498ae82f2d1f440535577b79559c62c3a73eb1e009dbd328aba96fb1056a9a87"} Nov 24 12:48:16 crc kubenswrapper[4752]: I1124 12:48:16.100734 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" podStartSLOduration=2.327257413 podStartE2EDuration="11.100710996s" podCreationTimestamp="2025-11-24 12:48:05 +0000 UTC" firstStartedPulling="2025-11-24 12:48:06.899600502 +0000 UTC m=+6092.884420791" lastFinishedPulling="2025-11-24 12:48:15.673054065 +0000 UTC m=+6101.657874374" observedRunningTime="2025-11-24 12:48:16.096220767 +0000 UTC m=+6102.081041066" watchObservedRunningTime="2025-11-24 12:48:16.100710996 +0000 UTC m=+6102.085531285" Nov 24 12:48:30 crc kubenswrapper[4752]: I1124 
12:48:30.221146 4752 generic.go:334] "Generic (PLEG): container finished" podID="0e4de15b-2388-46ea-a966-8471f04cd894" containerID="498ae82f2d1f440535577b79559c62c3a73eb1e009dbd328aba96fb1056a9a87" exitCode=0 Nov 24 12:48:30 crc kubenswrapper[4752]: I1124 12:48:30.221198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" event={"ID":"0e4de15b-2388-46ea-a966-8471f04cd894","Type":"ContainerDied","Data":"498ae82f2d1f440535577b79559c62c3a73eb1e009dbd328aba96fb1056a9a87"} Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.751392 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.930610 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljf6d\" (UniqueName: \"kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d\") pod \"0e4de15b-2388-46ea-a966-8471f04cd894\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.930673 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory\") pod \"0e4de15b-2388-46ea-a966-8471f04cd894\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.930931 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph\") pod \"0e4de15b-2388-46ea-a966-8471f04cd894\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.931023 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key\") pod \"0e4de15b-2388-46ea-a966-8471f04cd894\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.931091 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle\") pod \"0e4de15b-2388-46ea-a966-8471f04cd894\" (UID: \"0e4de15b-2388-46ea-a966-8471f04cd894\") " Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.936661 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph" (OuterVolumeSpecName: "ceph") pod "0e4de15b-2388-46ea-a966-8471f04cd894" (UID: "0e4de15b-2388-46ea-a966-8471f04cd894"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.937041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d" (OuterVolumeSpecName: "kube-api-access-ljf6d") pod "0e4de15b-2388-46ea-a966-8471f04cd894" (UID: "0e4de15b-2388-46ea-a966-8471f04cd894"). InnerVolumeSpecName "kube-api-access-ljf6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.945915 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "0e4de15b-2388-46ea-a966-8471f04cd894" (UID: "0e4de15b-2388-46ea-a966-8471f04cd894"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.963530 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0e4de15b-2388-46ea-a966-8471f04cd894" (UID: "0e4de15b-2388-46ea-a966-8471f04cd894"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:48:31 crc kubenswrapper[4752]: I1124 12:48:31.980082 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory" (OuterVolumeSpecName: "inventory") pod "0e4de15b-2388-46ea-a966-8471f04cd894" (UID: "0e4de15b-2388-46ea-a966-8471f04cd894"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.035039 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.035091 4752 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.035108 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljf6d\" (UniqueName: \"kubernetes.io/projected/0e4de15b-2388-46ea-a966-8471f04cd894-kube-api-access-ljf6d\") on node \"crc\" DevicePath \"\"" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.035121 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.035133 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0e4de15b-2388-46ea-a966-8471f04cd894-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.295619 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" event={"ID":"0e4de15b-2388-46ea-a966-8471f04cd894","Type":"ContainerDied","Data":"563a860bd81acb09688899dfdf5bec992b6a1fb48db8844249e2642821075809"} Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.295671 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563a860bd81acb09688899dfdf5bec992b6a1fb48db8844249e2642821075809" Nov 24 12:48:32 crc kubenswrapper[4752]: I1124 12:48:32.295779 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.772679 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7"] Nov 24 12:48:38 crc kubenswrapper[4752]: E1124 12:48:38.773670 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4de15b-2388-46ea-a966-8471f04cd894" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.773685 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4de15b-2388-46ea-a966-8471f04cd894" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.778264 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e4de15b-2388-46ea-a966-8471f04cd894" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.779643 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.791454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7"] Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.792590 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.792872 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.792950 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.792994 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.901497 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.901938 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzm9s\" (UniqueName: \"kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.902866 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:38 crc 
kubenswrapper[4752]: I1124 12:48:38.903118 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:38 crc kubenswrapper[4752]: I1124 12:48:38.903161 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.006704 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzm9s\" (UniqueName: \"kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.006811 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.006901 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.006931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.006957 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.013346 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.013417 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.014934 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.015139 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.026659 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzm9s\" (UniqueName: \"kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.111279 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.715411 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7"] Nov 24 12:48:39 crc kubenswrapper[4752]: I1124 12:48:39.724764 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:48:40 crc kubenswrapper[4752]: I1124 12:48:40.404379 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" event={"ID":"3e737c81-721d-4220-ac1e-24a3057556fe","Type":"ContainerStarted","Data":"64d004cd1d0a1f1b2d835a9b4e2471f67d5b1ad28f9c22990e158b117218816e"} Nov 24 12:48:41 crc kubenswrapper[4752]: I1124 12:48:41.415161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" event={"ID":"3e737c81-721d-4220-ac1e-24a3057556fe","Type":"ContainerStarted","Data":"472da0a9aa5a42f9bdbabdb912792f366ce314d3548ffd75422efa6b24aca980"} Nov 24 12:48:41 crc kubenswrapper[4752]: I1124 12:48:41.432095 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" podStartSLOduration=3.015658993 podStartE2EDuration="3.43206675s" podCreationTimestamp="2025-11-24 12:48:38 +0000 UTC" firstStartedPulling="2025-11-24 12:48:39.724473634 +0000 UTC m=+6125.709293923" lastFinishedPulling="2025-11-24 12:48:40.140881391 +0000 UTC m=+6126.125701680" observedRunningTime="2025-11-24 12:48:41.4296399 +0000 UTC m=+6127.414460199" watchObservedRunningTime="2025-11-24 12:48:41.43206675 +0000 UTC m=+6127.416887059" Nov 24 12:48:45 crc kubenswrapper[4752]: I1124 12:48:45.468442 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:48:45 crc kubenswrapper[4752]: I1124 12:48:45.469125 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.468595 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.469562 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.469612 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.470618 4752 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.470681 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b" gracePeriod=600 Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.796376 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b" exitCode=0 Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.796425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b"} Nov 24 12:49:15 crc kubenswrapper[4752]: I1124 12:49:15.796503 4752 scope.go:117] "RemoveContainer" containerID="2856f89824856d474bf147248c78cc6aab4d866c6c4a631c464c721708251588" Nov 24 12:49:16 crc kubenswrapper[4752]: I1124 12:49:16.808854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b"} Nov 24 12:49:46 crc kubenswrapper[4752]: I1124 12:49:46.043777 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-shb95"] Nov 24 12:49:46 crc kubenswrapper[4752]: I1124 12:49:46.055132 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-shb95"] Nov 24 12:49:46 crc kubenswrapper[4752]: I1124 12:49:46.753863 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3847960-35ca-445f-8c43-06dfbef18148" path="/var/lib/kubelet/pods/d3847960-35ca-445f-8c43-06dfbef18148/volumes" Nov 24 12:49:47 crc kubenswrapper[4752]: I1124 12:49:47.698049 4752 scope.go:117] "RemoveContainer" containerID="d62f337cc56a0924cbe6a1ebc6adc4ff3b24f9b292d2fa77ad418a375db03505" Nov 24 12:49:48 crc kubenswrapper[4752]: I1124 12:49:48.055420 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-8169-account-create-76s8k"] Nov 24 12:49:48 crc kubenswrapper[4752]: I1124 12:49:48.068454 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-8169-account-create-76s8k"] Nov 24 12:49:48 crc kubenswrapper[4752]: I1124 12:49:48.744848 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5322bd-b92a-44ec-92ff-519b2d922f8e" path="/var/lib/kubelet/pods/7f5322bd-b92a-44ec-92ff-519b2d922f8e/volumes" Nov 24 12:49:53 crc kubenswrapper[4752]: I1124 12:49:53.033237 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-m5lqs"] Nov 24 12:49:53 crc kubenswrapper[4752]: I1124 12:49:53.046100 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-m5lqs"] Nov 24 
12:49:54 crc kubenswrapper[4752]: I1124 12:49:54.745231 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c470417c-fbc7-4525-a5c5-e5d7a99cfa08" path="/var/lib/kubelet/pods/c470417c-fbc7-4525-a5c5-e5d7a99cfa08/volumes" Nov 24 12:49:55 crc kubenswrapper[4752]: I1124 12:49:55.056079 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-b221-account-create-lm2zh"] Nov 24 12:49:55 crc kubenswrapper[4752]: I1124 12:49:55.069105 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-b221-account-create-lm2zh"] Nov 24 12:49:56 crc kubenswrapper[4752]: I1124 12:49:56.740947 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8ac779-7b0c-4aba-9a2c-8624a1a55516" path="/var/lib/kubelet/pods/8a8ac779-7b0c-4aba-9a2c-8624a1a55516/volumes" Nov 24 12:50:41 crc kubenswrapper[4752]: I1124 12:50:41.048406 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-gjzpt"] Nov 24 12:50:41 crc kubenswrapper[4752]: I1124 12:50:41.058818 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-gjzpt"] Nov 24 12:50:42 crc kubenswrapper[4752]: I1124 12:50:42.745798 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc345722-cbf8-460e-bc35-86df16ab5f27" path="/var/lib/kubelet/pods/dc345722-cbf8-460e-bc35-86df16ab5f27/volumes" Nov 24 12:50:47 crc kubenswrapper[4752]: I1124 12:50:47.816177 4752 scope.go:117] "RemoveContainer" containerID="38d2f9b6f84d73d99ba0e9c3d9cbcf112f0dd97838b9ab9f3dcbb916f7f442b4" Nov 24 12:50:47 crc kubenswrapper[4752]: I1124 12:50:47.848225 4752 scope.go:117] "RemoveContainer" containerID="3a07ae90a33a655478501b1e0bd4ed8027fe71ee1680e3cb4425379acc09b6de" Nov 24 12:50:47 crc kubenswrapper[4752]: I1124 12:50:47.943477 4752 scope.go:117] "RemoveContainer" containerID="ea501866f7dd91a7fecc400f2b7b86685e522964dbe74fbd25586f3fa286ca0d" Nov 24 12:50:47 crc kubenswrapper[4752]: I1124 12:50:47.994835 4752 scope.go:117] "RemoveContainer" containerID="8d77796551eb0cbd11259ae6a87dbc713018f76403c385718038614f08eaf4bd" Nov 24 12:50:48 crc kubenswrapper[4752]: I1124 12:50:48.071952 4752 scope.go:117] "RemoveContainer" containerID="9a713a4c2e75a1604df15ef04f5e0a88c28b15a38d1e4b44d9f490fc28d47551" Nov 24 12:51:45 crc kubenswrapper[4752]: I1124 12:51:45.469393 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:51:45 crc kubenswrapper[4752]: I1124 12:51:45.470067 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:52:15 crc kubenswrapper[4752]: I1124 12:52:15.469610 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:52:15 crc kubenswrapper[4752]: I1124 12:52:15.470287 4752 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:52:45 crc kubenswrapper[4752]: I1124 12:52:45.469257 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 12:52:45 crc kubenswrapper[4752]: I1124 12:52:45.469920 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 12:52:45 crc kubenswrapper[4752]: I1124 12:52:45.469984 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 12:52:45 crc kubenswrapper[4752]: I1124 12:52:45.470917 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 12:52:45 crc kubenswrapper[4752]: I1124 12:52:45.470972 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" gracePeriod=600 Nov 24 12:52:45 crc kubenswrapper[4752]: E1124 12:52:45.650912 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:52:46 crc kubenswrapper[4752]: I1124 12:52:46.410469 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" exitCode=0 Nov 24 12:52:46 crc kubenswrapper[4752]: I1124 12:52:46.410813 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b"} Nov 24 12:52:46 crc kubenswrapper[4752]: I1124 12:52:46.410855 4752 scope.go:117] "RemoveContainer" containerID="658b3ae483b45978b2c67da668bc0398eca5583a5c4be24385a4896168f7191b" Nov 24 12:52:46 crc kubenswrapper[4752]: I1124 12:52:46.411542 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:52:46 crc kubenswrapper[4752]: E1124 
12:52:46.411845 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:00 crc kubenswrapper[4752]: I1124 12:53:00.728237 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:53:00 crc kubenswrapper[4752]: E1124 12:53:00.728965 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:13 crc kubenswrapper[4752]: I1124 12:53:13.051349 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-jn9pl"] Nov 24 12:53:13 crc kubenswrapper[4752]: I1124 12:53:13.060877 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-8683-account-create-qqbxv"] Nov 24 12:53:13 crc kubenswrapper[4752]: I1124 12:53:13.068949 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-jn9pl"] Nov 24 12:53:13 crc kubenswrapper[4752]: I1124 12:53:13.076489 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-8683-account-create-qqbxv"] Nov 24 12:53:13 crc kubenswrapper[4752]: I1124 12:53:13.729025 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:53:13 crc kubenswrapper[4752]: E1124 12:53:13.729443 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:14 crc kubenswrapper[4752]: I1124 12:53:14.761029 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb17497-6c03-4e7b-a42a-0d7f1829fbcf" path="/var/lib/kubelet/pods/1cb17497-6c03-4e7b-a42a-0d7f1829fbcf/volumes" Nov 24 12:53:14 crc kubenswrapper[4752]: I1124 12:53:14.762140 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b548dc2a-4fbc-4303-9ace-3e58cb26598f" path="/var/lib/kubelet/pods/b548dc2a-4fbc-4303-9ace-3e58cb26598f/volumes" Nov 24 12:53:26 crc kubenswrapper[4752]: I1124 12:53:26.735800 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:53:26 crc kubenswrapper[4752]: E1124 12:53:26.739412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:27 crc kubenswrapper[4752]: I1124 12:53:27.046180 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-dxb8j"] Nov 24 12:53:27 crc kubenswrapper[4752]: I1124 12:53:27.057872 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-dxb8j"] Nov 24 12:53:28 crc kubenswrapper[4752]: I1124 12:53:28.741506 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d043b2-da1a-4a85-9193-9b9214e09776" path="/var/lib/kubelet/pods/49d043b2-da1a-4a85-9193-9b9214e09776/volumes" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.421253 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.424226 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.434712 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.539014 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.539237 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqhj\" (UniqueName: \"kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.539282 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.641460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqhj\" (UniqueName: \"kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.641540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.641660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " 
pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.642504 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.642515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.669820 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqhj\" (UniqueName: \"kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj\") pod \"redhat-operators-zrw7t\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:35 crc kubenswrapper[4752]: I1124 12:53:35.745463 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:36 crc kubenswrapper[4752]: I1124 12:53:36.265431 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:36 crc kubenswrapper[4752]: I1124 12:53:36.963018 4752 generic.go:334] "Generic (PLEG): container finished" podID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerID="c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb" exitCode=0 Nov 24 12:53:36 crc kubenswrapper[4752]: I1124 12:53:36.963121 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerDied","Data":"c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb"} Nov 24 12:53:36 crc kubenswrapper[4752]: I1124 12:53:36.963814 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerStarted","Data":"0b6ba3e56b255d176e6e5df09b1c4cfc48b4e4e1c118f7b8501f0dc7d345a71a"} Nov 24 12:53:37 crc kubenswrapper[4752]: I1124 12:53:37.973857 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerStarted","Data":"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68"} Nov 24 12:53:39 crc kubenswrapper[4752]: I1124 12:53:39.728301 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:53:39 crc kubenswrapper[4752]: E1124 12:53:39.730221 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:43 crc kubenswrapper[4752]: I1124 12:53:43.036399 4752 generic.go:334] "Generic (PLEG): 
container finished" podID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerID="7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68" exitCode=0 Nov 24 12:53:43 crc kubenswrapper[4752]: I1124 12:53:43.036463 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerDied","Data":"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68"} Nov 24 12:53:43 crc kubenswrapper[4752]: I1124 12:53:43.040177 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:53:44 crc kubenswrapper[4752]: I1124 12:53:44.056718 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerStarted","Data":"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708"} Nov 24 12:53:44 crc kubenswrapper[4752]: I1124 12:53:44.079318 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrw7t" podStartSLOduration=2.509953349 podStartE2EDuration="9.079299057s" podCreationTimestamp="2025-11-24 12:53:35 +0000 UTC" firstStartedPulling="2025-11-24 12:53:36.966291317 +0000 UTC m=+6422.951111606" lastFinishedPulling="2025-11-24 12:53:43.535637025 +0000 UTC m=+6429.520457314" observedRunningTime="2025-11-24 12:53:44.077264719 +0000 UTC m=+6430.062085018" watchObservedRunningTime="2025-11-24 12:53:44.079299057 +0000 UTC m=+6430.064119356" Nov 24 12:53:45 crc kubenswrapper[4752]: I1124 12:53:45.746272 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:45 crc kubenswrapper[4752]: I1124 12:53:45.746594 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:46 crc kubenswrapper[4752]: I1124 12:53:46.809984 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrw7t" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="registry-server" probeResult="failure" output=< Nov 24 12:53:46 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 12:53:46 crc kubenswrapper[4752]: > Nov 24 12:53:48 crc kubenswrapper[4752]: I1124 12:53:48.261533 4752 scope.go:117] "RemoveContainer" containerID="6d596103c0c9697c86bb84c669d7afc4cb06ca9307cdf78f9a8cacda8b3692c7" Nov 24 12:53:48 crc kubenswrapper[4752]: I1124 12:53:48.293071 4752 scope.go:117] "RemoveContainer" containerID="7ebb701caad73543aafffd680acd0e9a3c49f28e9df8899a36efdf8261e1d622" Nov 24 12:53:48 crc kubenswrapper[4752]: I1124 12:53:48.343901 4752 scope.go:117] "RemoveContainer" containerID="87cb9603077deef98005c5cb1f2c032839dcb5cb7bdc1e8b33a5c0efaccbbeef" Nov 24 12:53:50 crc kubenswrapper[4752]: I1124 12:53:50.728586 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:53:50 crc kubenswrapper[4752]: E1124 12:53:50.729295 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:53:55 crc kubenswrapper[4752]: I1124 12:53:55.813859 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:55 crc kubenswrapper[4752]: I1124 12:53:55.872064 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:56 crc kubenswrapper[4752]: I1124 12:53:56.057709 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.197593 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrw7t" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="registry-server" containerID="cri-o://ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708" gracePeriod=2 Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.806945 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.992479 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhqhj\" (UniqueName: \"kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj\") pod \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.992853 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities\") pod \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.993072 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content\") pod \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\" (UID: \"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24\") " Nov 24 12:53:57 crc kubenswrapper[4752]: I1124 12:53:57.993923 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities" (OuterVolumeSpecName: "utilities") pod "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" (UID: "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.002270 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj" (OuterVolumeSpecName: "kube-api-access-nhqhj") pod "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" (UID: "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24"). InnerVolumeSpecName "kube-api-access-nhqhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.087854 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" (UID: "34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.096394 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhqhj\" (UniqueName: \"kubernetes.io/projected/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-kube-api-access-nhqhj\") on node \"crc\" DevicePath \"\"" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.096449 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.096467 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.208854 4752 generic.go:334] "Generic (PLEG): container finished" podID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerID="ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708" exitCode=0 Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.208904 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrw7t" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.208914 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerDied","Data":"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708"} Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.208974 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrw7t" event={"ID":"34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24","Type":"ContainerDied","Data":"0b6ba3e56b255d176e6e5df09b1c4cfc48b4e4e1c118f7b8501f0dc7d345a71a"} Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.209004 4752 scope.go:117] "RemoveContainer" containerID="ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.245969 4752 scope.go:117] "RemoveContainer" containerID="7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.254116 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.262802 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrw7t"] Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.275143 4752 scope.go:117] "RemoveContainer" containerID="c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.331431 4752 scope.go:117] "RemoveContainer" containerID="ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708" Nov 24 12:53:58 crc kubenswrapper[4752]: E1124 12:53:58.331962 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708\": container with ID starting with ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708 not found: ID does not exist" containerID="ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.332002 4752 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708"} err="failed to get container status \"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708\": rpc error: code = NotFound desc = could not find container \"ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708\": container with ID starting with ede461addad97ff11c025a00724200e3eec615c4bf786fda946fa11923bcf708 not found: ID does not exist" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.332036 4752 scope.go:117] "RemoveContainer" containerID="7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68" Nov 24 12:53:58 crc kubenswrapper[4752]: E1124 12:53:58.332416 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68\": container with ID starting with 7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68 not found: ID does not exist" containerID="7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.332464 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68"} err="failed to get container status \"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68\": rpc error: code = NotFound desc = could not find container \"7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68\": container with ID starting with 7ca10b84c4dd125c1287baf711cb0e48309bcb79db0de4bfa5ceab09bdb43f68 not found: ID does not exist" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.332493 4752 scope.go:117] "RemoveContainer" containerID="c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb" Nov 24 12:53:58 crc kubenswrapper[4752]: E1124 12:53:58.332946 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb\": container with ID starting with c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb not found: ID does not exist" containerID="c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.332980 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb"} err="failed to get container status \"c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb\": rpc error: code = NotFound desc = could not find container \"c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb\": container with ID starting with c79facf542153daa476e4099e4d41a49367b977f09a90be52e3666d65c2efaeb not found: ID does not exist" Nov 24 12:53:58 crc kubenswrapper[4752]: I1124 12:53:58.741169 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" path="/var/lib/kubelet/pods/34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24/volumes" Nov 24 12:54:05 crc kubenswrapper[4752]: I1124 12:54:05.728003 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:54:05 crc kubenswrapper[4752]: E1124 12:54:05.728803 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:54:16 crc kubenswrapper[4752]: I1124 12:54:16.728457 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:54:16 crc kubenswrapper[4752]: E1124 12:54:16.729301 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:54:31 crc kubenswrapper[4752]: I1124 12:54:31.728929 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:54:31 crc kubenswrapper[4752]: E1124 12:54:31.729933 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:54:43 crc kubenswrapper[4752]: I1124 12:54:43.728776 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:54:43 crc kubenswrapper[4752]: E1124 12:54:43.729941 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:54:54 crc kubenswrapper[4752]: I1124 12:54:54.734842 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:54:54 crc kubenswrapper[4752]: E1124 12:54:54.735876 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:55:08 crc kubenswrapper[4752]: I1124 12:55:08.745817 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:55:08 crc kubenswrapper[4752]: E1124 12:55:08.747339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.338553 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:13 crc kubenswrapper[4752]: E1124 12:55:13.339678 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="extract-utilities" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.339695 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="extract-utilities" Nov 24 12:55:13 crc kubenswrapper[4752]: E1124 12:55:13.339725 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="extract-content" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.339735 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="extract-content" Nov 24 12:55:13 crc kubenswrapper[4752]: E1124 12:55:13.339780 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="registry-server" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.339789 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="registry-server" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.340098 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f8a7fd-7f6b-4b54-a09f-5e67b79e2c24" containerName="registry-server" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.342026 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.356885 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.412705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.412830 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.412916 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vbk7\" (UniqueName: \"kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.515000 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.515119 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vbk7\" (UniqueName: \"kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.515312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.515618 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.515898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.539049 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4vbk7\" (UniqueName: \"kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7\") pod \"certified-operators-lgnnh\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:13 crc kubenswrapper[4752]: I1124 12:55:13.671677 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:14 crc kubenswrapper[4752]: I1124 12:55:14.245511 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:15 crc kubenswrapper[4752]: I1124 12:55:15.006760 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerID="78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed" exitCode=0 Nov 24 12:55:15 crc kubenswrapper[4752]: I1124 12:55:15.006807 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerDied","Data":"78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed"} Nov 24 12:55:15 crc kubenswrapper[4752]: I1124 12:55:15.007126 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerStarted","Data":"47dffe4d06504683511c0eea9137cc83c8803de8a73b3d3e63c201834433d7ac"} Nov 24 12:55:17 crc kubenswrapper[4752]: I1124 12:55:17.026549 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerStarted","Data":"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865"} Nov 24 12:55:18 crc kubenswrapper[4752]: I1124 12:55:18.040568 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerID="75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865" exitCode=0 Nov 24 12:55:18 crc kubenswrapper[4752]: I1124 12:55:18.040681 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerDied","Data":"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865"} Nov 24 12:55:19 crc kubenswrapper[4752]: I1124 12:55:19.058582 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerStarted","Data":"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb"} Nov 24 12:55:19 crc kubenswrapper[4752]: I1124 12:55:19.076621 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lgnnh" podStartSLOduration=2.390994127 podStartE2EDuration="6.076601974s" podCreationTimestamp="2025-11-24 12:55:13 +0000 UTC" firstStartedPulling="2025-11-24 12:55:15.009316997 +0000 UTC m=+6520.994137286" lastFinishedPulling="2025-11-24 12:55:18.694924844 +0000 UTC m=+6524.679745133" observedRunningTime="2025-11-24 12:55:19.076315336 +0000 UTC m=+6525.061135625" watchObservedRunningTime="2025-11-24 12:55:19.076601974 +0000 UTC m=+6525.061422263" Nov 24 12:55:23 crc kubenswrapper[4752]: I1124 12:55:23.672095 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:23 crc kubenswrapper[4752]: I1124 12:55:23.672777 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:23 crc kubenswrapper[4752]: I1124 12:55:23.724988 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:23 crc kubenswrapper[4752]: I1124 12:55:23.728263 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:55:23 crc kubenswrapper[4752]: E1124 12:55:23.728606 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:55:24 crc kubenswrapper[4752]: I1124 12:55:24.149380 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:24 crc kubenswrapper[4752]: I1124 12:55:24.207018 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.122577 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lgnnh" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="registry-server" containerID="cri-o://8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb" gracePeriod=2 Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.659369 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.816048 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities\") pod \"7a533085-f3d8-4f4b-a39f-3a995e006a80\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.816295 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content\") pod \"7a533085-f3d8-4f4b-a39f-3a995e006a80\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.816386 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vbk7\" (UniqueName: \"kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7\") pod \"7a533085-f3d8-4f4b-a39f-3a995e006a80\" (UID: \"7a533085-f3d8-4f4b-a39f-3a995e006a80\") " Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.817664 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities" (OuterVolumeSpecName: "utilities") pod "7a533085-f3d8-4f4b-a39f-3a995e006a80" (UID: "7a533085-f3d8-4f4b-a39f-3a995e006a80"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.819638 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.825089 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7" (OuterVolumeSpecName: "kube-api-access-4vbk7") pod "7a533085-f3d8-4f4b-a39f-3a995e006a80" (UID: "7a533085-f3d8-4f4b-a39f-3a995e006a80"). InnerVolumeSpecName "kube-api-access-4vbk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.863427 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a533085-f3d8-4f4b-a39f-3a995e006a80" (UID: "7a533085-f3d8-4f4b-a39f-3a995e006a80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.920936 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a533085-f3d8-4f4b-a39f-3a995e006a80-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 12:55:26 crc kubenswrapper[4752]: I1124 12:55:26.921389 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vbk7\" (UniqueName: \"kubernetes.io/projected/7a533085-f3d8-4f4b-a39f-3a995e006a80-kube-api-access-4vbk7\") on node \"crc\" DevicePath \"\"" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.139469 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerID="8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb" exitCode=0 Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.139538 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerDied","Data":"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb"} Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.139592 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lgnnh" event={"ID":"7a533085-f3d8-4f4b-a39f-3a995e006a80","Type":"ContainerDied","Data":"47dffe4d06504683511c0eea9137cc83c8803de8a73b3d3e63c201834433d7ac"} Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.139623 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lgnnh" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.139624 4752 scope.go:117] "RemoveContainer" containerID="8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.161400 4752 scope.go:117] "RemoveContainer" containerID="75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.183605 4752 scope.go:117] "RemoveContainer" containerID="78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.192797 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.203826 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lgnnh"] Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.251003 4752 scope.go:117] "RemoveContainer" containerID="8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb" Nov 24 12:55:27 crc kubenswrapper[4752]: E1124 12:55:27.251434 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb\": container with ID starting with 8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb not found: ID does not exist" containerID="8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.251483 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb"} err="failed to get container status \"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb\": rpc error: code = NotFound desc = could not find container \"8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb\": container with ID starting with 8af7ad62d6f50b1f9049c59aebcbd738e4f73b02806bccaf7318e300b02ee5fb not found: ID does not exist" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.251513 4752 scope.go:117] "RemoveContainer" containerID="75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865" Nov 24 12:55:27 crc kubenswrapper[4752]: E1124 12:55:27.251886 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865\": container with ID starting with 75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865 not found: ID does not exist" containerID="75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.251960 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865"} err="failed to get container status \"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865\": rpc error: code = NotFound desc = could not find container \"75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865\": container with ID starting with 75c4d76dfd77f07fa5c24fc130f0096dfb3f2b739f87e4ce15e534aa181d1865 not found: ID does not exist" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.251990 4752 scope.go:117] "RemoveContainer" 
containerID="78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed" Nov 24 12:55:27 crc kubenswrapper[4752]: E1124 12:55:27.252324 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed\": container with ID starting with 78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed not found: ID does not exist" containerID="78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed" Nov 24 12:55:27 crc kubenswrapper[4752]: I1124 12:55:27.252364 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed"} err="failed to get container status \"78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed\": rpc error: code = NotFound desc = could not find container \"78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed\": container with ID starting with 78d788376e62df990a92cf9ecb0db66ce5b0740787600726051efac141695fed not found: ID does not exist" Nov 24 12:55:28 crc kubenswrapper[4752]: I1124 12:55:28.740501 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" path="/var/lib/kubelet/pods/7a533085-f3d8-4f4b-a39f-3a995e006a80/volumes" Nov 24 12:55:33 crc kubenswrapper[4752]: I1124 12:55:33.066011 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-g69bx"] Nov 24 12:55:33 crc kubenswrapper[4752]: I1124 12:55:33.081114 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-g69bx"] Nov 24 12:55:34 crc kubenswrapper[4752]: I1124 12:55:34.048236 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-4512-account-create-v6zvq"] Nov 24 12:55:34 crc kubenswrapper[4752]: I1124 12:55:34.060534 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-4512-account-create-v6zvq"] Nov 24 12:55:34 crc kubenswrapper[4752]: I1124 12:55:34.741452 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329f7583-4cd1-46ce-a48d-8f549725dadd" path="/var/lib/kubelet/pods/329f7583-4cd1-46ce-a48d-8f549725dadd/volumes" Nov 24 12:55:34 crc kubenswrapper[4752]: I1124 12:55:34.742525 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdd3410-1be6-4f7f-bcb6-6787a8181fdf" path="/var/lib/kubelet/pods/8fdd3410-1be6-4f7f-bcb6-6787a8181fdf/volumes" Nov 24 12:55:38 crc kubenswrapper[4752]: I1124 12:55:38.728935 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:55:38 crc kubenswrapper[4752]: E1124 12:55:38.730151 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:55:46 crc kubenswrapper[4752]: I1124 12:55:46.029986 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-74jvb"] Nov 24 12:55:46 crc kubenswrapper[4752]: I1124 12:55:46.041541 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-74jvb"] Nov 24 12:55:46 crc kubenswrapper[4752]: 
I1124 12:55:46.744730 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73030570-ae25-48f4-bdee-e70a12c2623e" path="/var/lib/kubelet/pods/73030570-ae25-48f4-bdee-e70a12c2623e/volumes" Nov 24 12:55:48 crc kubenswrapper[4752]: I1124 12:55:48.494467 4752 scope.go:117] "RemoveContainer" containerID="080d0570df3e1f664a50c2fc76edb6cbc96175b04748fb07fa81f7210a945759" Nov 24 12:55:48 crc kubenswrapper[4752]: I1124 12:55:48.532656 4752 scope.go:117] "RemoveContainer" containerID="3e79baf00deb37f543240d8d9693dbf04c89662d72894b560e698a8b4bababa9" Nov 24 12:55:48 crc kubenswrapper[4752]: I1124 12:55:48.582824 4752 scope.go:117] "RemoveContainer" containerID="98dc4929120b24fbe9ed100aa2120d926debdafe325a7b8d856a7b07d4d6337c" Nov 24 12:55:49 crc kubenswrapper[4752]: I1124 12:55:49.728625 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:55:49 crc kubenswrapper[4752]: E1124 12:55:49.729490 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:56:01 crc kubenswrapper[4752]: I1124 12:56:01.728775 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:56:01 crc kubenswrapper[4752]: E1124 12:56:01.729594 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:56:09 crc kubenswrapper[4752]: I1124 12:56:09.055111 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-d2fwj"] Nov 24 12:56:09 crc kubenswrapper[4752]: I1124 12:56:09.072495 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-e5e7-account-create-nrcvg"] Nov 24 12:56:09 crc kubenswrapper[4752]: I1124 12:56:09.083938 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-e5e7-account-create-nrcvg"] Nov 24 12:56:09 crc kubenswrapper[4752]: I1124 12:56:09.092857 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-d2fwj"] Nov 24 12:56:10 crc kubenswrapper[4752]: I1124 12:56:10.749378 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b1add3-2b06-4170-ac77-e588f45ac2c9" path="/var/lib/kubelet/pods/75b1add3-2b06-4170-ac77-e588f45ac2c9/volumes" Nov 24 12:56:10 crc kubenswrapper[4752]: I1124 12:56:10.750843 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7779481c-2a7e-408f-a8d0-3ffbf8abe7a7" path="/var/lib/kubelet/pods/7779481c-2a7e-408f-a8d0-3ffbf8abe7a7/volumes" Nov 24 12:56:16 crc kubenswrapper[4752]: I1124 12:56:16.729300 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:56:16 crc kubenswrapper[4752]: E1124 12:56:16.730469 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:56:24 crc kubenswrapper[4752]: I1124 12:56:24.039606 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-x57sk"] Nov 24 12:56:24 crc kubenswrapper[4752]: I1124 12:56:24.049183 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-x57sk"] Nov 24 12:56:24 crc kubenswrapper[4752]: I1124 12:56:24.746213 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f6b200-e723-474f-8bb1-c4e502ebd5ad" path="/var/lib/kubelet/pods/16f6b200-e723-474f-8bb1-c4e502ebd5ad/volumes" Nov 24 12:56:27 crc kubenswrapper[4752]: I1124 12:56:27.728679 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:56:27 crc kubenswrapper[4752]: E1124 12:56:27.729469 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:56:41 crc kubenswrapper[4752]: I1124 12:56:41.728642 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:56:41 crc kubenswrapper[4752]: E1124 12:56:41.729640 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:56:48 crc kubenswrapper[4752]: I1124 12:56:48.774256 4752 scope.go:117] "RemoveContainer" containerID="7075480ff63c5ad7c9395589d146b87662edef8e1426ee784477f81506c4f158" Nov 24 12:56:48 crc kubenswrapper[4752]: I1124 12:56:48.811969 4752 scope.go:117] "RemoveContainer" containerID="4262aeebce2551c3175a6e45917416d62fd1b9aae5132152b2b7a354636e3d3c" Nov 24 12:56:48 crc kubenswrapper[4752]: I1124 12:56:48.864485 4752 scope.go:117] "RemoveContainer" containerID="8d9b9b3f9cc2041ed11f1330ad7b77af1b884ffe1cd4385713205ce7e971e29f" Nov 24 12:56:55 crc kubenswrapper[4752]: I1124 12:56:55.727726 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:56:55 crc kubenswrapper[4752]: E1124 12:56:55.728616 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:57:09 crc kubenswrapper[4752]: I1124 12:57:09.728179 4752 
scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:57:09 crc kubenswrapper[4752]: E1124 12:57:09.729274 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:57:22 crc kubenswrapper[4752]: I1124 12:57:22.728957 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:57:22 crc kubenswrapper[4752]: E1124 12:57:22.730042 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:57:37 crc kubenswrapper[4752]: I1124 12:57:37.729199 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:57:37 crc kubenswrapper[4752]: E1124 12:57:37.731118 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 12:57:50 crc kubenswrapper[4752]: I1124 12:57:50.730332 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b" Nov 24 12:57:51 crc kubenswrapper[4752]: I1124 12:57:51.622702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c"} Nov 24 12:59:17 crc kubenswrapper[4752]: I1124 12:59:17.450310 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e737c81-721d-4220-ac1e-24a3057556fe" containerID="472da0a9aa5a42f9bdbabdb912792f366ce314d3548ffd75422efa6b24aca980" exitCode=0 Nov 24 12:59:17 crc kubenswrapper[4752]: I1124 12:59:17.450507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" event={"ID":"3e737c81-721d-4220-ac1e-24a3057556fe","Type":"ContainerDied","Data":"472da0a9aa5a42f9bdbabdb912792f366ce314d3548ffd75422efa6b24aca980"} Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.042129 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.197714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph\") pod \"3e737c81-721d-4220-ac1e-24a3057556fe\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.198666 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzm9s\" (UniqueName: \"kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s\") pod \"3e737c81-721d-4220-ac1e-24a3057556fe\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.198709 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle\") pod \"3e737c81-721d-4220-ac1e-24a3057556fe\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.198731 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key\") pod \"3e737c81-721d-4220-ac1e-24a3057556fe\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.198778 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory\") pod \"3e737c81-721d-4220-ac1e-24a3057556fe\" (UID: \"3e737c81-721d-4220-ac1e-24a3057556fe\") " Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.204072 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph" (OuterVolumeSpecName: "ceph") pod "3e737c81-721d-4220-ac1e-24a3057556fe" (UID: "3e737c81-721d-4220-ac1e-24a3057556fe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.204323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "3e737c81-721d-4220-ac1e-24a3057556fe" (UID: "3e737c81-721d-4220-ac1e-24a3057556fe"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.208952 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s" (OuterVolumeSpecName: "kube-api-access-nzm9s") pod "3e737c81-721d-4220-ac1e-24a3057556fe" (UID: "3e737c81-721d-4220-ac1e-24a3057556fe"). InnerVolumeSpecName "kube-api-access-nzm9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.230240 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e737c81-721d-4220-ac1e-24a3057556fe" (UID: "3e737c81-721d-4220-ac1e-24a3057556fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.232312 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory" (OuterVolumeSpecName: "inventory") pod "3e737c81-721d-4220-ac1e-24a3057556fe" (UID: "3e737c81-721d-4220-ac1e-24a3057556fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.300398 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.300626 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzm9s\" (UniqueName: \"kubernetes.io/projected/3e737c81-721d-4220-ac1e-24a3057556fe-kube-api-access-nzm9s\") on node \"crc\" DevicePath \"\"" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.300703 4752 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.300792 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.300895 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e737c81-721d-4220-ac1e-24a3057556fe-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.476972 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" event={"ID":"3e737c81-721d-4220-ac1e-24a3057556fe","Type":"ContainerDied","Data":"64d004cd1d0a1f1b2d835a9b4e2471f67d5b1ad28f9c22990e158b117218816e"} Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.477037 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d004cd1d0a1f1b2d835a9b4e2471f67d5b1ad28f9c22990e158b117218816e" Nov 24 12:59:19 crc kubenswrapper[4752]: I1124 12:59:19.477075 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.866804 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-956sb"] Nov 24 12:59:26 crc kubenswrapper[4752]: E1124 12:59:26.867827 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="registry-server" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.867845 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="registry-server" Nov 24 12:59:26 crc kubenswrapper[4752]: E1124 12:59:26.867859 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e737c81-721d-4220-ac1e-24a3057556fe" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.867867 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e737c81-721d-4220-ac1e-24a3057556fe" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 24 12:59:26 crc kubenswrapper[4752]: E1124 12:59:26.867896 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="extract-utilities" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.867903 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="extract-utilities" Nov 24 12:59:26 crc kubenswrapper[4752]: E1124 12:59:26.867931 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="extract-content" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.867939 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="extract-content" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.868207 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a533085-f3d8-4f4b-a39f-3a995e006a80" containerName="registry-server" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.868229 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e737c81-721d-4220-ac1e-24a3057556fe" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.870032 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.874215 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.874262 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.874458 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.874475 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.882347 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-956sb"] Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.974349 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.974475 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45srs\" (UniqueName: \"kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.974643 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.974675 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:26 crc kubenswrapper[4752]: I1124 12:59:26.974981 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.077461 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: 
I1124 12:59:27.077529 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.077590 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.077646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.077738 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45srs\" (UniqueName: \"kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.083589 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.084107 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.085017 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.085117 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.099262 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45srs\" (UniqueName: \"kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs\") pod \"bootstrap-openstack-openstack-cell1-956sb\" (UID: 
\"d98851aa-db89-424e-9304-44681012e2f0\") " pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.223046 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.744670 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-956sb"] Nov 24 12:59:27 crc kubenswrapper[4752]: I1124 12:59:27.764471 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 12:59:28 crc kubenswrapper[4752]: I1124 12:59:28.557376 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" event={"ID":"d98851aa-db89-424e-9304-44681012e2f0","Type":"ContainerStarted","Data":"bf75b2951c1271f218553f85519095a0bd1a8bd1dd12987140ecfc50639e4c54"} Nov 24 12:59:28 crc kubenswrapper[4752]: I1124 12:59:28.557798 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" event={"ID":"d98851aa-db89-424e-9304-44681012e2f0","Type":"ContainerStarted","Data":"0f8399a27340672bf8cfe0709076f9c0d81790698454a5396dab886b11f9793d"} Nov 24 12:59:28 crc kubenswrapper[4752]: I1124 12:59:28.577542 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" podStartSLOduration=2.103208933 podStartE2EDuration="2.577516372s" podCreationTimestamp="2025-11-24 12:59:26 +0000 UTC" firstStartedPulling="2025-11-24 12:59:27.764064162 +0000 UTC m=+6773.748884491" lastFinishedPulling="2025-11-24 12:59:28.238371651 +0000 UTC m=+6774.223191930" observedRunningTime="2025-11-24 12:59:28.575257478 +0000 UTC m=+6774.560077767" watchObservedRunningTime="2025-11-24 12:59:28.577516372 +0000 UTC m=+6774.562336661" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.028218 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.036590 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.036590 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.045593 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.128166 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.128298 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.128445 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mcv\" (UniqueName: \"kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.230853 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.230955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.231028 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mcv\" (UniqueName: \"kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.231346 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.231462 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.251261 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k8mcv\" (UniqueName: \"kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv\") pod \"community-operators-llgnn\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.382922 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llgnn" Nov 24 12:59:59 crc kubenswrapper[4752]: I1124 12:59:59.931538 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.173710 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb"] Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.176989 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.178972 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.181098 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.192223 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb"] Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.258436 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvpcw\" (UniqueName: \"kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.258625 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.258713 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.361148 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvpcw\" (UniqueName: \"kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.361236 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.361289 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.362693 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.367798 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.378478 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvpcw\" (UniqueName: \"kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw\") pod \"collect-profiles-29399820-sltfb\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.562618 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.881414 4752 generic.go:334] "Generic (PLEG): container finished" podID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerID="a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21" exitCode=0 Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.881624 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerDied","Data":"a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21"} Nov 24 13:00:00 crc kubenswrapper[4752]: I1124 13:00:00.881646 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerStarted","Data":"8fc2f0bf59f7095e9c595929892bd863a8bd873ccc1ce73909c5f368ea3817a6"} Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.051968 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb"] Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.228948 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.232580 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.248767 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.380188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.380225 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.380267 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxs2k\" (UniqueName: \"kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.482614 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.483007 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.483147 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxs2k\" (UniqueName: \"kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.483179 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.483446 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.505435 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxs2k\" (UniqueName: \"kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k\") pod \"redhat-marketplace-ncr9b\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.556330 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.892473 4752 generic.go:334] "Generic (PLEG): container finished" podID="7c51d28f-4a83-4b39-bcff-877881ab970c" containerID="8d738190f518ed98518543d01faa817a6a104c05fe3aefc8374d73aa1f3c5e3c" exitCode=0 Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.892655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" event={"ID":"7c51d28f-4a83-4b39-bcff-877881ab970c","Type":"ContainerDied","Data":"8d738190f518ed98518543d01faa817a6a104c05fe3aefc8374d73aa1f3c5e3c"} Nov 24 13:00:01 crc kubenswrapper[4752]: I1124 13:00:01.892854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" event={"ID":"7c51d28f-4a83-4b39-bcff-877881ab970c","Type":"ContainerStarted","Data":"957a87d4da49490c3447b724667f72ce5f95e649e64c2a439fa5ac09887551db"} Nov 24 13:00:02 crc kubenswrapper[4752]: I1124 13:00:02.073104 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:02 crc kubenswrapper[4752]: I1124 13:00:02.904314 4752 generic.go:334] "Generic (PLEG): container finished" podID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerID="c7dfad64fc0084ef685ffbb096eb5d53e42b10ca9e18c37d4aeca082e3698d1c" exitCode=0 Nov 24 13:00:02 crc kubenswrapper[4752]: I1124 13:00:02.904374 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerDied","Data":"c7dfad64fc0084ef685ffbb096eb5d53e42b10ca9e18c37d4aeca082e3698d1c"} Nov 24 13:00:02 crc kubenswrapper[4752]: I1124 13:00:02.904654 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerStarted","Data":"2156bfa11f72192f0665e3c466a722967e775985116e6ef075ea00e2a545596e"} Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.325733 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.443080 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume\") pod \"7c51d28f-4a83-4b39-bcff-877881ab970c\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.443454 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume\") pod \"7c51d28f-4a83-4b39-bcff-877881ab970c\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.443513 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvpcw\" (UniqueName: \"kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw\") pod \"7c51d28f-4a83-4b39-bcff-877881ab970c\" (UID: \"7c51d28f-4a83-4b39-bcff-877881ab970c\") " Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.443993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c51d28f-4a83-4b39-bcff-877881ab970c" (UID: "7c51d28f-4a83-4b39-bcff-877881ab970c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.444137 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c51d28f-4a83-4b39-bcff-877881ab970c-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.449837 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw" (OuterVolumeSpecName: "kube-api-access-cvpcw") pod "7c51d28f-4a83-4b39-bcff-877881ab970c" (UID: "7c51d28f-4a83-4b39-bcff-877881ab970c"). InnerVolumeSpecName "kube-api-access-cvpcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.449900 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c51d28f-4a83-4b39-bcff-877881ab970c" (UID: "7c51d28f-4a83-4b39-bcff-877881ab970c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.550686 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c51d28f-4a83-4b39-bcff-877881ab970c-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.550722 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvpcw\" (UniqueName: \"kubernetes.io/projected/7c51d28f-4a83-4b39-bcff-877881ab970c-kube-api-access-cvpcw\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.914606 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" event={"ID":"7c51d28f-4a83-4b39-bcff-877881ab970c","Type":"ContainerDied","Data":"957a87d4da49490c3447b724667f72ce5f95e649e64c2a439fa5ac09887551db"} Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.914644 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957a87d4da49490c3447b724667f72ce5f95e649e64c2a439fa5ac09887551db" Nov 24 13:00:03 crc kubenswrapper[4752]: I1124 13:00:03.914651 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb" Nov 24 13:00:04 crc kubenswrapper[4752]: I1124 13:00:04.394189 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2"] Nov 24 13:00:04 crc kubenswrapper[4752]: I1124 13:00:04.414108 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399775-gqkn2"] Nov 24 13:00:04 crc kubenswrapper[4752]: I1124 13:00:04.743945 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478b588e-3987-4b51-8934-cf455f0a6408" path="/var/lib/kubelet/pods/478b588e-3987-4b51-8934-cf455f0a6408/volumes" Nov 24 13:00:05 crc kubenswrapper[4752]: I1124 13:00:05.980884 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerStarted","Data":"48b01143b5953f5161d5fdfeb36bea26c812a29c1f9f7dccaf79abfdaca012c3"} Nov 24 13:00:05 crc kubenswrapper[4752]: I1124 13:00:05.987308 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerStarted","Data":"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2"} Nov 24 13:00:06 crc kubenswrapper[4752]: E1124 13:00:06.272513 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d0e16b8_9861_42f5_9ea9_3bc37334e0d7.slice/crio-9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:00:07 crc kubenswrapper[4752]: I1124 13:00:07.000969 4752 generic.go:334] "Generic (PLEG): container finished" podID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerID="9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2" exitCode=0 Nov 24 13:00:07 crc kubenswrapper[4752]: I1124 13:00:07.001044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" 
event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerDied","Data":"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2"} Nov 24 13:00:08 crc kubenswrapper[4752]: I1124 13:00:08.014547 4752 generic.go:334] "Generic (PLEG): container finished" podID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerID="48b01143b5953f5161d5fdfeb36bea26c812a29c1f9f7dccaf79abfdaca012c3" exitCode=0 Nov 24 13:00:08 crc kubenswrapper[4752]: I1124 13:00:08.014638 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerDied","Data":"48b01143b5953f5161d5fdfeb36bea26c812a29c1f9f7dccaf79abfdaca012c3"} Nov 24 13:00:08 crc kubenswrapper[4752]: I1124 13:00:08.018994 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerStarted","Data":"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c"} Nov 24 13:00:08 crc kubenswrapper[4752]: I1124 13:00:08.054961 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llgnn" podStartSLOduration=3.481385215 podStartE2EDuration="10.054941081s" podCreationTimestamp="2025-11-24 12:59:58 +0000 UTC" firstStartedPulling="2025-11-24 13:00:00.883616593 +0000 UTC m=+6806.868436882" lastFinishedPulling="2025-11-24 13:00:07.457172439 +0000 UTC m=+6813.441992748" observedRunningTime="2025-11-24 13:00:08.052080429 +0000 UTC m=+6814.036900748" watchObservedRunningTime="2025-11-24 13:00:08.054941081 +0000 UTC m=+6814.039761370" Nov 24 13:00:09 crc kubenswrapper[4752]: I1124 13:00:09.031227 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerStarted","Data":"7c3a92f8b93385d7ac49c6e2fe4042131e9a1b21e84985f4ff70eeb9fd2c34d6"} Nov 24 13:00:09 crc kubenswrapper[4752]: I1124 13:00:09.383555 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:00:09 crc kubenswrapper[4752]: I1124 13:00:09.383617 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:00:10 crc kubenswrapper[4752]: I1124 13:00:10.431104 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-llgnn" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="registry-server" probeResult="failure" output=< Nov 24 13:00:10 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 13:00:10 crc kubenswrapper[4752]: > Nov 24 13:00:11 crc kubenswrapper[4752]: I1124 13:00:11.557143 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:11 crc kubenswrapper[4752]: I1124 13:00:11.557212 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:11 crc kubenswrapper[4752]: I1124 13:00:11.612975 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:11 crc kubenswrapper[4752]: I1124 13:00:11.654541 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-ncr9b" podStartSLOduration=5.046025189 podStartE2EDuration="10.654190935s" podCreationTimestamp="2025-11-24 13:00:01 +0000 UTC" firstStartedPulling="2025-11-24 13:00:02.906070144 +0000 UTC m=+6808.890890443" lastFinishedPulling="2025-11-24 13:00:08.51423587 +0000 UTC m=+6814.499056189" observedRunningTime="2025-11-24 13:00:09.060177275 +0000 UTC m=+6815.044997564" watchObservedRunningTime="2025-11-24 13:00:11.654190935 +0000 UTC m=+6817.639011254" Nov 24 13:00:15 crc kubenswrapper[4752]: I1124 13:00:15.468491 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:00:15 crc kubenswrapper[4752]: I1124 13:00:15.468956 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:00:19 crc kubenswrapper[4752]: I1124 13:00:19.450694 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:00:19 crc kubenswrapper[4752]: I1124 13:00:19.514498 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:00:19 crc kubenswrapper[4752]: I1124 13:00:19.577837 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 13:00:19 crc kubenswrapper[4752]: I1124 13:00:19.703432 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 13:00:19 crc kubenswrapper[4752]: I1124 13:00:19.704500 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nl7tl" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="registry-server" containerID="cri-o://6a5c2efbe9352ecf09dfc9cc19f1956bda6aa2fcfde12d97064f2e7c36b965e8" gracePeriod=2 Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.159334 4752 generic.go:334] "Generic (PLEG): container finished" podID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerID="6a5c2efbe9352ecf09dfc9cc19f1956bda6aa2fcfde12d97064f2e7c36b965e8" exitCode=0 Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.159414 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerDied","Data":"6a5c2efbe9352ecf09dfc9cc19f1956bda6aa2fcfde12d97064f2e7c36b965e8"} Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.298599 4752 util.go:48] "No ready sandbox for pod can be found. 
Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.298599 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.358003 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities\") pod \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.358084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-catalog-content\") pod \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.358202 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8q9\" (UniqueName: \"kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9\") pod \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\" (UID: \"59a05b3c-7f50-43bb-8fad-9de225d4fb96\") " Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.360516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities" (OuterVolumeSpecName: "utilities") pod "59a05b3c-7f50-43bb-8fad-9de225d4fb96" (UID: "59a05b3c-7f50-43bb-8fad-9de225d4fb96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.372304 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9" (OuterVolumeSpecName: "kube-api-access-zh8q9") pod "59a05b3c-7f50-43bb-8fad-9de225d4fb96" (UID: "59a05b3c-7f50-43bb-8fad-9de225d4fb96"). InnerVolumeSpecName "kube-api-access-zh8q9". PluginName "kubernetes.io/projected", VolumeGidValue ""
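Every teardown in this log follows the same three-step trace: `UnmountVolume started` (reconciler_common.go:159), `UnmountVolume.TearDown succeeded` (operation_generator.go:803), and finally `Volume detached` (reconciler_common.go:293). Pairing the first and last step per volume is a quick way to spot a volume that never finishes detaching. A sketch under that assumption; the patterns match this journal's quoting, including the escaped `\"` around volume names:

```python
import re

STARTED = re.compile(r'UnmountVolume started for volume \\?"(?P<name>[^\\"]+)\\?"')
DETACHED = re.compile(r'Volume detached for volume \\?"(?P<name>[^\\"]+)\\?"')

def stuck_volumes(journal_lines):
    """Return volume names that began unmounting but were never reported detached."""
    pending = set()
    for line in journal_lines:
        if m := STARTED.search(line):
            pending.add(m.group("name"))
        elif m := DETACHED.search(line):
            pending.discard(m.group("name"))
    return pending  # empty for this section: every unmount completes
```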
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.460999 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8q9\" (UniqueName: \"kubernetes.io/projected/59a05b3c-7f50-43bb-8fad-9de225d4fb96-kube-api-access-zh8q9\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.461037 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:20 crc kubenswrapper[4752]: I1124 13:00:20.461048 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59a05b3c-7f50-43bb-8fad-9de225d4fb96-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.172407 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nl7tl" event={"ID":"59a05b3c-7f50-43bb-8fad-9de225d4fb96","Type":"ContainerDied","Data":"1ab5d6cd31d9be362a6871309e1aa7d4e0b30ca91918efd9db71f4bef63d4935"} Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.172465 4752 scope.go:117] "RemoveContainer" containerID="6a5c2efbe9352ecf09dfc9cc19f1956bda6aa2fcfde12d97064f2e7c36b965e8" Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.172472 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nl7tl" Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.200463 4752 scope.go:117] "RemoveContainer" containerID="188fcfd77293d3326fecd6afae3a4273510e77c860a2decc9c73121acacc6b09" Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.204832 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.221726 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nl7tl"] Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.235252 4752 scope.go:117] "RemoveContainer" containerID="b31a6265aa77f22b3b745ec00b5ce2fe238242508f5a5b3ce5ccd62450150b0c" Nov 24 13:00:21 crc kubenswrapper[4752]: I1124 13:00:21.618954 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:22 crc kubenswrapper[4752]: I1124 13:00:22.750327 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" path="/var/lib/kubelet/pods/59a05b3c-7f50-43bb-8fad-9de225d4fb96/volumes" Nov 24 13:00:23 crc kubenswrapper[4752]: I1124 13:00:23.903436 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:23 crc kubenswrapper[4752]: I1124 13:00:23.904113 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncr9b" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="registry-server" containerID="cri-o://7c3a92f8b93385d7ac49c6e2fe4042131e9a1b21e84985f4ff70eeb9fd2c34d6" gracePeriod=2 Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.203456 4752 generic.go:334] "Generic (PLEG): container finished" podID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerID="7c3a92f8b93385d7ac49c6e2fe4042131e9a1b21e84985f4ff70eeb9fd2c34d6" exitCode=0 Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 
13:00:24.203502 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerDied","Data":"7c3a92f8b93385d7ac49c6e2fe4042131e9a1b21e84985f4ff70eeb9fd2c34d6"} Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.416614 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.551789 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content\") pod \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.552003 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities\") pod \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.552048 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxs2k\" (UniqueName: \"kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k\") pod \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\" (UID: \"93f62ad3-ec0a-430c-aa7a-be03559e8fc6\") " Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.552703 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities" (OuterVolumeSpecName: "utilities") pod "93f62ad3-ec0a-430c-aa7a-be03559e8fc6" (UID: "93f62ad3-ec0a-430c-aa7a-be03559e8fc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.557414 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k" (OuterVolumeSpecName: "kube-api-access-qxs2k") pod "93f62ad3-ec0a-430c-aa7a-be03559e8fc6" (UID: "93f62ad3-ec0a-430c-aa7a-be03559e8fc6"). InnerVolumeSpecName "kube-api-access-qxs2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.571975 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93f62ad3-ec0a-430c-aa7a-be03559e8fc6" (UID: "93f62ad3-ec0a-430c-aa7a-be03559e8fc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.654421 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.654457 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxs2k\" (UniqueName: \"kubernetes.io/projected/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-kube-api-access-qxs2k\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:24 crc kubenswrapper[4752]: I1124 13:00:24.654470 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93f62ad3-ec0a-430c-aa7a-be03559e8fc6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.216295 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncr9b" event={"ID":"93f62ad3-ec0a-430c-aa7a-be03559e8fc6","Type":"ContainerDied","Data":"2156bfa11f72192f0665e3c466a722967e775985116e6ef075ea00e2a545596e"} Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.216560 4752 scope.go:117] "RemoveContainer" containerID="7c3a92f8b93385d7ac49c6e2fe4042131e9a1b21e84985f4ff70eeb9fd2c34d6" Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.216378 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncr9b" Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.241416 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.246436 4752 scope.go:117] "RemoveContainer" containerID="48b01143b5953f5161d5fdfeb36bea26c812a29c1f9f7dccaf79abfdaca012c3" Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.252087 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncr9b"] Nov 24 13:00:25 crc kubenswrapper[4752]: I1124 13:00:25.290101 4752 scope.go:117] "RemoveContainer" containerID="c7dfad64fc0084ef685ffbb096eb5d53e42b10ca9e18c37d4aeca082e3698d1c" Nov 24 13:00:26 crc kubenswrapper[4752]: I1124 13:00:26.738697 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" path="/var/lib/kubelet/pods/93f62ad3-ec0a-430c-aa7a-be03559e8fc6/volumes" Nov 24 13:00:45 crc kubenswrapper[4752]: I1124 13:00:45.469187 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:00:45 crc kubenswrapper[4752]: I1124 13:00:45.469937 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:00:49 crc kubenswrapper[4752]: I1124 13:00:49.059938 4752 scope.go:117] "RemoveContainer" containerID="92e120f4ed3198101cd9f950393fe58fd6ebfec3363950c9eb88558d96b13480" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.189129 4752 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29399821-qd67c"] Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190076 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="extract-utilities" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190090 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="extract-utilities" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190100 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190106 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190128 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="extract-content" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190134 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="extract-content" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190150 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c51d28f-4a83-4b39-bcff-877881ab970c" containerName="collect-profiles" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190157 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c51d28f-4a83-4b39-bcff-877881ab970c" containerName="collect-profiles" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190171 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="extract-content" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190178 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="extract-content" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190197 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="extract-utilities" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190203 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="extract-utilities" Nov 24 13:01:00 crc kubenswrapper[4752]: E1124 13:01:00.190226 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190231 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190428 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f62ad3-ec0a-430c-aa7a-be03559e8fc6" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190468 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="59a05b3c-7f50-43bb-8fad-9de225d4fb96" containerName="registry-server" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.190488 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c51d28f-4a83-4b39-bcff-877881ab970c" containerName="collect-profiles" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.191257 4752 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.280621 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29399821-qd67c"] Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.289968 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.290017 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.290077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.290233 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4z5\" (UniqueName: \"kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.391669 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4z5\" (UniqueName: \"kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.392043 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.392064 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.392103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.397926 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.399306 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.405880 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.408972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4z5\" (UniqueName: \"kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5\") pod \"keystone-cron-29399821-qd67c\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") " pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:00 crc kubenswrapper[4752]: I1124 13:01:00.511604 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399821-qd67c" Nov 24 13:01:01 crc kubenswrapper[4752]: I1124 13:01:01.030049 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29399821-qd67c"] Nov 24 13:01:01 crc kubenswrapper[4752]: I1124 13:01:01.621158 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399821-qd67c" event={"ID":"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2","Type":"ContainerStarted","Data":"116a13926b45ec3b1585b49383494c3b794c4e1b8ff3e36667c5d40faebde9eb"} Nov 24 13:01:01 crc kubenswrapper[4752]: I1124 13:01:01.621533 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399821-qd67c" event={"ID":"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2","Type":"ContainerStarted","Data":"17d00e38f9c45f392c6f229b02f7831c18b7c006a4e42ef26820e9fd33ceb66e"} Nov 24 13:01:01 crc kubenswrapper[4752]: I1124 13:01:01.647378 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29399821-qd67c" podStartSLOduration=1.647355272 podStartE2EDuration="1.647355272s" podCreationTimestamp="2025-11-24 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:01:01.639656861 +0000 UTC m=+6867.624477160" watchObservedRunningTime="2025-11-24 13:01:01.647355272 +0000 UTC m=+6867.632175581" Nov 24 13:01:04 crc kubenswrapper[4752]: I1124 13:01:04.657799 4752 generic.go:334] "Generic (PLEG): container finished" podID="6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" containerID="116a13926b45ec3b1585b49383494c3b794c4e1b8ff3e36667c5d40faebde9eb" exitCode=0 Nov 24 13:01:04 crc kubenswrapper[4752]: I1124 13:01:04.657896 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399821-qd67c" event={"ID":"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2","Type":"ContainerDied","Data":"116a13926b45ec3b1585b49383494c3b794c4e1b8ff3e36667c5d40faebde9eb"} Nov 24 13:01:06 crc 
kubenswrapper[4752]: I1124 13:01:06.065277 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399821-qd67c"
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.225919 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys\") pod \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") "
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.226392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle\") pod \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") "
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.227139 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk4z5\" (UniqueName: \"kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5\") pod \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") "
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.227350 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data\") pod \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\" (UID: \"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2\") "
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.232276 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" (UID: "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.248021 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5" (OuterVolumeSpecName: "kube-api-access-dk4z5") pod "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" (UID: "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2"). InnerVolumeSpecName "kube-api-access-dk4z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.256249 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" (UID: "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.281037 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data" (OuterVolumeSpecName: "config-data") pod "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" (UID: "6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.329520 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk4z5\" (UniqueName: \"kubernetes.io/projected/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-kube-api-access-dk4z5\") on node \"crc\" DevicePath \"\""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.329560 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-config-data\") on node \"crc\" DevicePath \"\""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.329573 4752 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.329581 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.676101 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399821-qd67c" event={"ID":"6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2","Type":"ContainerDied","Data":"17d00e38f9c45f392c6f229b02f7831c18b7c006a4e42ef26820e9fd33ceb66e"}
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.676149 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17d00e38f9c45f392c6f229b02f7831c18b7c006a4e42ef26820e9fd33ceb66e"
Nov 24 13:01:06 crc kubenswrapper[4752]: I1124 13:01:06.676219 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399821-qd67c"
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.468725 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.469435 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.469487 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.470358 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.470412 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c" gracePeriod=600
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.760104 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c" exitCode=0
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.760597 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c"}
Nov 24 13:01:15 crc kubenswrapper[4752]: I1124 13:01:15.760633 4752 scope.go:117] "RemoveContainer" containerID="93c57cc4c01503e626ed124c01600822ac967ab96dcecca9218b3f4bd94b1b3b"
Nov 24 13:01:16 crc kubenswrapper[4752]: I1124 13:01:16.775409 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"}
Nov 24 13:02:53 crc kubenswrapper[4752]: I1124 13:02:53.816509 4752 generic.go:334] "Generic (PLEG): container finished" podID="d98851aa-db89-424e-9304-44681012e2f0" containerID="bf75b2951c1271f218553f85519095a0bd1a8bd1dd12987140ecfc50639e4c54" exitCode=0
Nov 24 13:02:53 crc kubenswrapper[4752]: I1124 13:02:53.816658 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" event={"ID":"d98851aa-db89-424e-9304-44681012e2f0","Type":"ContainerDied","Data":"bf75b2951c1271f218553f85519095a0bd1a8bd1dd12987140ecfc50639e4c54"}
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.327404 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-956sb"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.485561 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph\") pod \"d98851aa-db89-424e-9304-44681012e2f0\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") "
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.485624 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45srs\" (UniqueName: \"kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs\") pod \"d98851aa-db89-424e-9304-44681012e2f0\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") "
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.485650 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle\") pod \"d98851aa-db89-424e-9304-44681012e2f0\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") "
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.485717 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key\") pod \"d98851aa-db89-424e-9304-44681012e2f0\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") "
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.488520 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory\") pod \"d98851aa-db89-424e-9304-44681012e2f0\" (UID: \"d98851aa-db89-424e-9304-44681012e2f0\") "
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.501183 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph" (OuterVolumeSpecName: "ceph") pod "d98851aa-db89-424e-9304-44681012e2f0" (UID: "d98851aa-db89-424e-9304-44681012e2f0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.516703 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs" (OuterVolumeSpecName: "kube-api-access-45srs") pod "d98851aa-db89-424e-9304-44681012e2f0" (UID: "d98851aa-db89-424e-9304-44681012e2f0"). InnerVolumeSpecName "kube-api-access-45srs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.521475 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d98851aa-db89-424e-9304-44681012e2f0" (UID: "d98851aa-db89-424e-9304-44681012e2f0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.547758 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d98851aa-db89-424e-9304-44681012e2f0" (UID: "d98851aa-db89-424e-9304-44681012e2f0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.558271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory" (OuterVolumeSpecName: "inventory") pod "d98851aa-db89-424e-9304-44681012e2f0" (UID: "d98851aa-db89-424e-9304-44681012e2f0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.593298 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ceph\") on node \"crc\" DevicePath \"\""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.593342 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45srs\" (UniqueName: \"kubernetes.io/projected/d98851aa-db89-424e-9304-44681012e2f0-kube-api-access-45srs\") on node \"crc\" DevicePath \"\""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.593436 4752 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.593450 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.593462 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d98851aa-db89-424e-9304-44681012e2f0-inventory\") on node \"crc\" DevicePath \"\""
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.836305 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-956sb" event={"ID":"d98851aa-db89-424e-9304-44681012e2f0","Type":"ContainerDied","Data":"0f8399a27340672bf8cfe0709076f9c0d81790698454a5396dab886b11f9793d"}
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.836353 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8399a27340672bf8cfe0709076f9c0d81790698454a5396dab886b11f9793d"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.836410 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-956sb"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.949446 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-t6pjb"]
Nov 24 13:02:55 crc kubenswrapper[4752]: E1124 13:02:55.951061 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" containerName="keystone-cron"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.951086 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" containerName="keystone-cron"
Nov 24 13:02:55 crc kubenswrapper[4752]: E1124 13:02:55.951298 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98851aa-db89-424e-9304-44681012e2f0" containerName="bootstrap-openstack-openstack-cell1"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.951317 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98851aa-db89-424e-9304-44681012e2f0" containerName="bootstrap-openstack-openstack-cell1"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.951710 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98851aa-db89-424e-9304-44681012e2f0" containerName="bootstrap-openstack-openstack-cell1"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.951733 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2" containerName="keystone-cron"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.953405 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.959308 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.959485 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.959602 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.960202 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 24 13:02:55 crc kubenswrapper[4752]: I1124 13:02:55.960953 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-t6pjb"]
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.005624 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.005679 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqm48\" (UniqueName: \"kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.005775 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.005896 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.107650 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.107785 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.107928 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqm48\" (UniqueName: \"kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.107955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.111672 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.111914 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.115091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.128067 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqm48\" (UniqueName: \"kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48\") pod \"download-cache-openstack-openstack-cell1-t6pjb\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") " pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.295260 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:02:56 crc kubenswrapper[4752]: I1124 13:02:56.873424 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-t6pjb"]
Nov 24 13:02:57 crc kubenswrapper[4752]: I1124 13:02:57.856264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb" event={"ID":"291a26fd-6d14-45f7-bd62-cf7806827e6d","Type":"ContainerStarted","Data":"8a9d3f66d495cf9bf156fe6aca353d22fc1ba7011f3e714bc8340b18fe5405da"}
Nov 24 13:02:57 crc kubenswrapper[4752]: I1124 13:02:57.856882 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb" event={"ID":"291a26fd-6d14-45f7-bd62-cf7806827e6d","Type":"ContainerStarted","Data":"9f092c813060ea489dd87f4b88e98c03b000855027506e258b8602d00b33fce4"}
Nov 24 13:02:57 crc kubenswrapper[4752]: I1124 13:02:57.883791 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb" podStartSLOduration=2.359303767 podStartE2EDuration="2.883765336s" podCreationTimestamp="2025-11-24 13:02:55 +0000 UTC" firstStartedPulling="2025-11-24 13:02:56.869401511 +0000 UTC m=+6982.854221800" lastFinishedPulling="2025-11-24 13:02:57.39386308 +0000 UTC m=+6983.378683369" observedRunningTime="2025-11-24 13:02:57.871602617 +0000 UTC m=+6983.856422936" watchObservedRunningTime="2025-11-24 13:02:57.883765336 +0000 UTC m=+6983.868585645"
Nov 24 13:03:15 crc kubenswrapper[4752]: I1124 13:03:15.469012 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 13:03:15 crc kubenswrapper[4752]: I1124 13:03:15.469625 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 13:03:45 crc kubenswrapper[4752]: I1124 13:03:45.469242 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 13:03:45 crc kubenswrapper[4752]: I1124 13:03:45.470094 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.468677 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.469708 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.469817 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.470981 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.471060 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" gracePeriod=600
Nov 24 13:04:15 crc kubenswrapper[4752]: E1124 13:04:15.597024 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.626975 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" exitCode=0
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.627020 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"}
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.627055 4752 scope.go:117] "RemoveContainer" containerID="0788fdd31bd33cfe1bebc621b6dface2712d5104da45f2898cca2cf2493e647c"
Nov 24 13:04:15 crc kubenswrapper[4752]: I1124 13:04:15.628388 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:04:15 crc kubenswrapper[4752]: E1124 13:04:15.629879 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.080443 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.084731 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.106009 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.237390 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgwn\" (UniqueName: \"kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.237460 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.237592 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.339684 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgwn\" (UniqueName: \"kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.340122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.340269 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.340726 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.340831 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.366603 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgwn\" (UniqueName: \"kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn\") pod \"redhat-operators-4dcbc\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") " pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.417873 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:24 crc kubenswrapper[4752]: I1124 13:04:24.939711 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:04:25 crc kubenswrapper[4752]: I1124 13:04:25.742033 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerID="0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b" exitCode=0
Nov 24 13:04:25 crc kubenswrapper[4752]: I1124 13:04:25.742293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerDied","Data":"0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b"}
Nov 24 13:04:25 crc kubenswrapper[4752]: I1124 13:04:25.742452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerStarted","Data":"768ad4872ad7d60396ab5c7cda75ef6e7328af26d372272f864a640f355f027d"}
Nov 24 13:04:26 crc kubenswrapper[4752]: I1124 13:04:26.752040 4752 generic.go:334] "Generic (PLEG): container finished" podID="291a26fd-6d14-45f7-bd62-cf7806827e6d" containerID="8a9d3f66d495cf9bf156fe6aca353d22fc1ba7011f3e714bc8340b18fe5405da" exitCode=0
Nov 24 13:04:26 crc kubenswrapper[4752]: I1124 13:04:26.752098 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb" event={"ID":"291a26fd-6d14-45f7-bd62-cf7806827e6d","Type":"ContainerDied","Data":"8a9d3f66d495cf9bf156fe6aca353d22fc1ba7011f3e714bc8340b18fe5405da"}
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.276109 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.340026 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph\") pod \"291a26fd-6d14-45f7-bd62-cf7806827e6d\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") "
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.340177 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key\") pod \"291a26fd-6d14-45f7-bd62-cf7806827e6d\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") "
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.340265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory\") pod \"291a26fd-6d14-45f7-bd62-cf7806827e6d\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") "
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.340346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqm48\" (UniqueName: \"kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48\") pod \"291a26fd-6d14-45f7-bd62-cf7806827e6d\" (UID: \"291a26fd-6d14-45f7-bd62-cf7806827e6d\") "
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.346885 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48" (OuterVolumeSpecName: "kube-api-access-fqm48") pod "291a26fd-6d14-45f7-bd62-cf7806827e6d" (UID: "291a26fd-6d14-45f7-bd62-cf7806827e6d"). InnerVolumeSpecName "kube-api-access-fqm48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.359598 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph" (OuterVolumeSpecName: "ceph") pod "291a26fd-6d14-45f7-bd62-cf7806827e6d" (UID: "291a26fd-6d14-45f7-bd62-cf7806827e6d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.376408 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory" (OuterVolumeSpecName: "inventory") pod "291a26fd-6d14-45f7-bd62-cf7806827e6d" (UID: "291a26fd-6d14-45f7-bd62-cf7806827e6d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.377118 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "291a26fd-6d14-45f7-bd62-cf7806827e6d" (UID: "291a26fd-6d14-45f7-bd62-cf7806827e6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.442365 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.442410 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-inventory\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.442424 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqm48\" (UniqueName: \"kubernetes.io/projected/291a26fd-6d14-45f7-bd62-cf7806827e6d-kube-api-access-fqm48\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.442436 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/291a26fd-6d14-45f7-bd62-cf7806827e6d-ceph\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.773266 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb" event={"ID":"291a26fd-6d14-45f7-bd62-cf7806827e6d","Type":"ContainerDied","Data":"9f092c813060ea489dd87f4b88e98c03b000855027506e258b8602d00b33fce4"}
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.773309 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f092c813060ea489dd87f4b88e98c03b000855027506e258b8602d00b33fce4"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.773380 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-t6pjb"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.851068 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ftbqj"]
Nov 24 13:04:28 crc kubenswrapper[4752]: E1124 13:04:28.851545 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291a26fd-6d14-45f7-bd62-cf7806827e6d" containerName="download-cache-openstack-openstack-cell1"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.851575 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="291a26fd-6d14-45f7-bd62-cf7806827e6d" containerName="download-cache-openstack-openstack-cell1"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.851795 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="291a26fd-6d14-45f7-bd62-cf7806827e6d" containerName="download-cache-openstack-openstack-cell1"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.852635 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.855460 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.855564 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.856216 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.857083 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.871716 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ftbqj"]
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.956795 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.956966 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.957126 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsx8\" (UniqueName: \"kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:28 crc kubenswrapper[4752]: I1124 13:04:28.957233 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.060157 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.060862 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.061090 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsx8\" (UniqueName: \"kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.061196 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.064531 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.064831 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.065115 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.080278 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsx8\" (UniqueName: \"kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8\") pod \"configure-network-openstack-openstack-cell1-ftbqj\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") " pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.180216 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:04:29 crc kubenswrapper[4752]: I1124 13:04:29.820262 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ftbqj"]
Nov 24 13:04:30 crc kubenswrapper[4752]: I1124 13:04:30.728078 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:04:30 crc kubenswrapper[4752]: E1124 13:04:30.728610 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:04:34 crc kubenswrapper[4752]: W1124 13:04:34.591783 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b8b46cb_745f_489f_84e7_2b4e9001ac6e.slice/crio-a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666 WatchSource:0}: Error finding container a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666: Status 404 returned error can't find the container with id a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666
Nov 24 13:04:34 crc kubenswrapper[4752]: I1124 13:04:34.593944 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 24 13:04:34 crc kubenswrapper[4752]: I1124 13:04:34.839446 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" event={"ID":"9b8b46cb-745f-489f-84e7-2b4e9001ac6e","Type":"ContainerStarted","Data":"a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666"}
Nov 24 13:04:35 crc kubenswrapper[4752]: I1124 13:04:35.077297 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 24 13:04:35 crc kubenswrapper[4752]: I1124 13:04:35.856380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerStarted","Data":"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"}
Nov 24 13:04:36 crc kubenswrapper[4752]: I1124 13:04:36.867425 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerID="784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac" exitCode=0
Nov 24 13:04:36 crc kubenswrapper[4752]: I1124 13:04:36.867498 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerDied","Data":"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"}
Nov 24 13:04:36 crc kubenswrapper[4752]: I1124 13:04:36.870198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" event={"ID":"9b8b46cb-745f-489f-84e7-2b4e9001ac6e","Type":"ContainerStarted","Data":"37495c9dd612a7c3d35d9d98b23db1dd446ae3d3ab9119ab9b5bfc56eb457687"}
Nov 24 13:04:36 crc kubenswrapper[4752]: I1124 13:04:36.916187 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" podStartSLOduration=8.437312471 podStartE2EDuration="8.916163771s" podCreationTimestamp="2025-11-24 13:04:28 +0000 UTC" firstStartedPulling="2025-11-24 13:04:34.593567709 +0000 UTC m=+7080.578388018" lastFinishedPulling="2025-11-24 13:04:35.072419019 +0000 UTC m=+7081.057239318" observedRunningTime="2025-11-24 13:04:36.913216796 +0000 UTC m=+7082.898037085" watchObservedRunningTime="2025-11-24 13:04:36.916163771 +0000 UTC m=+7082.900984070"
Nov 24 13:04:37 crc kubenswrapper[4752]: I1124 13:04:37.884012 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerStarted","Data":"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"}
Nov 24 13:04:37 crc kubenswrapper[4752]: I1124 13:04:37.911351 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4dcbc" podStartSLOduration=2.097402137 podStartE2EDuration="13.911327284s" podCreationTimestamp="2025-11-24 13:04:24 +0000 UTC" firstStartedPulling="2025-11-24 13:04:25.746946292 +0000 UTC m=+7071.731766581" lastFinishedPulling="2025-11-24 13:04:37.560871419 +0000 UTC m=+7083.545691728" observedRunningTime="2025-11-24 13:04:37.904005713 +0000 UTC m=+7083.888826002" watchObservedRunningTime="2025-11-24 13:04:37.911327284 +0000 UTC m=+7083.896147573"
Nov 24 13:04:41 crc kubenswrapper[4752]: I1124 13:04:41.728961 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:04:41 crc kubenswrapper[4752]: E1124 13:04:41.730535 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:04:44 crc kubenswrapper[4752]: I1124 13:04:44.418122 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:44 crc kubenswrapper[4752]: I1124 13:04:44.418416 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:44 crc kubenswrapper[4752]: I1124 13:04:44.483562 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:45 crc kubenswrapper[4752]: I1124 13:04:45.050484 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:04:45 crc kubenswrapper[4752]: I1124 13:04:45.531321 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.097488 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"]
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.098048 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4qjnj" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="registry-server" containerID="cri-o://03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e" gracePeriod=2
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.644570 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qjnj"
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.826078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv92w\" (UniqueName: \"kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w\") pod \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") "
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.826119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content\") pod \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") "
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.826185 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities\") pod \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\" (UID: \"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce\") "
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.827454 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities" (OuterVolumeSpecName: "utilities") pod "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" (UID: "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.828026 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.833932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w" (OuterVolumeSpecName: "kube-api-access-qv92w") pod "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" (UID: "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce"). InnerVolumeSpecName "kube-api-access-qv92w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.919586 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" (UID: "2aa3cc59-ad7a-4874-84bd-1d0b05b3acce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.930428 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv92w\" (UniqueName: \"kubernetes.io/projected/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-kube-api-access-qv92w\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.930465 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.980110 4752 generic.go:334] "Generic (PLEG): container finished" podID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerID="03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e" exitCode=0
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.980220 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerDied","Data":"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"}
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.980279 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qjnj" event={"ID":"2aa3cc59-ad7a-4874-84bd-1d0b05b3acce","Type":"ContainerDied","Data":"5683aa0ee90b76d11e1b67c1eaa9ffc6fcc61082560c6d4131a22b6dbe3908ae"}
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.980273 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qjnj"
Nov 24 13:04:46 crc kubenswrapper[4752]: I1124 13:04:46.980305 4752 scope.go:117] "RemoveContainer" containerID="03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.007838 4752 scope.go:117] "RemoveContainer" containerID="ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.019409 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"]
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.028124 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qjnj"]
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.045720 4752 scope.go:117] "RemoveContainer" containerID="a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.089583 4752 scope.go:117] "RemoveContainer" containerID="03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"
Nov 24 13:04:47 crc kubenswrapper[4752]: E1124 13:04:47.090092 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e\": container with ID starting with 03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e not found: ID does not exist" containerID="03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.090121 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e"} err="failed to get container status \"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e\": rpc error: code = NotFound desc = could not find container \"03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e\": container with ID starting with 03b20c090304d0de1e14e982886a0c03359f971aae4a2dbc698e1bf3e94e6c6e not found: ID does not exist"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.090144 4752 scope.go:117] "RemoveContainer" containerID="ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595"
Nov 24 13:04:47 crc kubenswrapper[4752]: E1124 13:04:47.090510 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595\": container with ID starting with ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595 not found: ID does not exist" containerID="ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.090546 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595"} err="failed to get container status \"ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595\": rpc error: code = NotFound desc = could not find container \"ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595\": container with ID starting with ae1e54c2e2ff06d43a625e4085d7ca800abaf8970e81a33ff6054ee62665c595 not found: ID does not exist"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.090568 4752 scope.go:117] "RemoveContainer" containerID="a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d"
Nov 24 13:04:47 crc kubenswrapper[4752]: E1124 13:04:47.090979 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d\": container with ID starting with a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d not found: ID does not exist" containerID="a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d"
Nov 24 13:04:47 crc kubenswrapper[4752]: I1124 13:04:47.091024 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d"} err="failed to get container status \"a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d\": rpc error: code = NotFound desc = could not find container \"a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d\": container with ID starting with a179407bbe5cf2a0f0c5d4a39d2495519a04dab92d22c20a11229126ac4bd89d not found: ID does not exist"
Nov 24 13:04:48 crc kubenswrapper[4752]: I1124 13:04:48.740285 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" path="/var/lib/kubelet/pods/2aa3cc59-ad7a-4874-84bd-1d0b05b3acce/volumes"
Nov 24 13:04:56 crc kubenswrapper[4752]: I1124 13:04:56.729285 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:04:56 crc kubenswrapper[4752]: E1124 13:04:56.730024 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:08 crc kubenswrapper[4752]: I1124 13:05:08.728854 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:05:08 crc kubenswrapper[4752]: E1124 13:05:08.730025 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:19 crc kubenswrapper[4752]: I1124 13:05:19.729301 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:05:19 crc kubenswrapper[4752]: E1124 13:05:19.730292 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:31 crc kubenswrapper[4752]: I1124 13:05:31.728793 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:05:31 crc kubenswrapper[4752]: E1124 13:05:31.729919 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:44 crc kubenswrapper[4752]: I1124 13:05:44.738052 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:05:44 crc kubenswrapper[4752]: E1124 13:05:44.738957 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:56 crc kubenswrapper[4752]: I1124 13:05:56.732427 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:05:56 crc kubenswrapper[4752]: E1124 13:05:56.734878 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:05:56 crc kubenswrapper[4752]: I1124 13:05:56.763152 4752 generic.go:334] "Generic (PLEG): container finished" podID="9b8b46cb-745f-489f-84e7-2b4e9001ac6e" containerID="37495c9dd612a7c3d35d9d98b23db1dd446ae3d3ab9119ab9b5bfc56eb457687" exitCode=0
Nov 24 13:05:56 crc kubenswrapper[4752]: I1124 13:05:56.763201 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" event={"ID":"9b8b46cb-745f-489f-84e7-2b4e9001ac6e","Type":"ContainerDied","Data":"37495c9dd612a7c3d35d9d98b23db1dd446ae3d3ab9119ab9b5bfc56eb457687"}
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.217904 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj"
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.298437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgsx8\" (UniqueName: \"kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8\") pod \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") "
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.298508 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key\") pod \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") "
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.298665 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph\") pod \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") "
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.298729 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory\") pod \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\" (UID: \"9b8b46cb-745f-489f-84e7-2b4e9001ac6e\") "
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.303600 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8" (OuterVolumeSpecName: "kube-api-access-lgsx8") pod "9b8b46cb-745f-489f-84e7-2b4e9001ac6e" (UID: "9b8b46cb-745f-489f-84e7-2b4e9001ac6e"). InnerVolumeSpecName "kube-api-access-lgsx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.304344 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph" (OuterVolumeSpecName: "ceph") pod "9b8b46cb-745f-489f-84e7-2b4e9001ac6e" (UID: "9b8b46cb-745f-489f-84e7-2b4e9001ac6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.327248 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory" (OuterVolumeSpecName: "inventory") pod "9b8b46cb-745f-489f-84e7-2b4e9001ac6e" (UID: "9b8b46cb-745f-489f-84e7-2b4e9001ac6e"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.327808 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b8b46cb-745f-489f-84e7-2b4e9001ac6e" (UID: "9b8b46cb-745f-489f-84e7-2b4e9001ac6e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.401820 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgsx8\" (UniqueName: \"kubernetes.io/projected/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-kube-api-access-lgsx8\") on node \"crc\" DevicePath \"\"" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.401865 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.401878 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.401889 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8b46cb-745f-489f-84e7-2b4e9001ac6e-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.785064 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" event={"ID":"9b8b46cb-745f-489f-84e7-2b4e9001ac6e","Type":"ContainerDied","Data":"a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666"} Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.785453 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14c63be982f70628efd75fa7762f654b7467449dfb04d045608ccf27a40c666" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.785134 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ftbqj" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.875883 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-hhcgm"] Nov 24 13:05:58 crc kubenswrapper[4752]: E1124 13:05:58.876315 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="registry-server" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876335 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="registry-server" Nov 24 13:05:58 crc kubenswrapper[4752]: E1124 13:05:58.876347 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8b46cb-745f-489f-84e7-2b4e9001ac6e" containerName="configure-network-openstack-openstack-cell1" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876355 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8b46cb-745f-489f-84e7-2b4e9001ac6e" containerName="configure-network-openstack-openstack-cell1" Nov 24 13:05:58 crc kubenswrapper[4752]: E1124 13:05:58.876378 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="extract-content" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876385 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="extract-content" Nov 24 13:05:58 crc kubenswrapper[4752]: E1124 13:05:58.876398 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="extract-utilities" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876403 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="extract-utilities" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876613 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8b46cb-745f-489f-84e7-2b4e9001ac6e" containerName="configure-network-openstack-openstack-cell1" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.876632 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa3cc59-ad7a-4874-84bd-1d0b05b3acce" containerName="registry-server" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.877449 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.880094 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.880240 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.880409 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.880575 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:05:58 crc kubenswrapper[4752]: I1124 13:05:58.893652 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-hhcgm"] Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.020778 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.021086 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.021208 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwsw\" (UniqueName: \"kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.021314 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.123172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwsw\" (UniqueName: \"kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.123282 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " 
pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.123481 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.123655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.134482 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.134849 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.135054 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.155965 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwsw\" (UniqueName: \"kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw\") pod \"validate-network-openstack-openstack-cell1-hhcgm\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.196482 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:05:59 crc kubenswrapper[4752]: I1124 13:05:59.792591 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-hhcgm"] Nov 24 13:05:59 crc kubenswrapper[4752]: W1124 13:05:59.796715 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3b41b1_fa64_45fc_9d41_19b0be538111.slice/crio-e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319 WatchSource:0}: Error finding container e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319: Status 404 returned error can't find the container with id e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319 Nov 24 13:06:00 crc kubenswrapper[4752]: I1124 13:06:00.803969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" event={"ID":"6b3b41b1-fa64-45fc-9d41-19b0be538111","Type":"ContainerStarted","Data":"384f7ef4e17a3f0a7102875f249dd62ceb6d16b0664dc82063f2a23340ea9fa4"} Nov 24 13:06:00 crc kubenswrapper[4752]: I1124 13:06:00.804025 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" event={"ID":"6b3b41b1-fa64-45fc-9d41-19b0be538111","Type":"ContainerStarted","Data":"e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319"} Nov 24 13:06:00 crc kubenswrapper[4752]: I1124 13:06:00.826834 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" podStartSLOduration=2.259994939 podStartE2EDuration="2.826811382s" podCreationTimestamp="2025-11-24 13:05:58 +0000 UTC" firstStartedPulling="2025-11-24 13:05:59.798728913 +0000 UTC m=+7165.783549202" lastFinishedPulling="2025-11-24 13:06:00.365545366 +0000 UTC m=+7166.350365645" observedRunningTime="2025-11-24 13:06:00.817877685 +0000 UTC m=+7166.802697974" watchObservedRunningTime="2025-11-24 13:06:00.826811382 +0000 UTC m=+7166.811631671" Nov 24 13:06:05 crc kubenswrapper[4752]: I1124 13:06:05.882684 4752 generic.go:334] "Generic (PLEG): container finished" podID="6b3b41b1-fa64-45fc-9d41-19b0be538111" containerID="384f7ef4e17a3f0a7102875f249dd62ceb6d16b0664dc82063f2a23340ea9fa4" exitCode=0 Nov 24 13:06:05 crc kubenswrapper[4752]: I1124 13:06:05.882889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" event={"ID":"6b3b41b1-fa64-45fc-9d41-19b0be538111","Type":"ContainerDied","Data":"384f7ef4e17a3f0a7102875f249dd62ceb6d16b0664dc82063f2a23340ea9fa4"} Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.373297 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.529084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph\") pod \"6b3b41b1-fa64-45fc-9d41-19b0be538111\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.529534 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key\") pod \"6b3b41b1-fa64-45fc-9d41-19b0be538111\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.529617 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbwsw\" (UniqueName: \"kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw\") pod \"6b3b41b1-fa64-45fc-9d41-19b0be538111\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.529795 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory\") pod \"6b3b41b1-fa64-45fc-9d41-19b0be538111\" (UID: \"6b3b41b1-fa64-45fc-9d41-19b0be538111\") " Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.534279 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw" (OuterVolumeSpecName: "kube-api-access-nbwsw") pod "6b3b41b1-fa64-45fc-9d41-19b0be538111" (UID: "6b3b41b1-fa64-45fc-9d41-19b0be538111"). InnerVolumeSpecName "kube-api-access-nbwsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.537429 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph" (OuterVolumeSpecName: "ceph") pod "6b3b41b1-fa64-45fc-9d41-19b0be538111" (UID: "6b3b41b1-fa64-45fc-9d41-19b0be538111"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.567956 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b3b41b1-fa64-45fc-9d41-19b0be538111" (UID: "6b3b41b1-fa64-45fc-9d41-19b0be538111"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.580132 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory" (OuterVolumeSpecName: "inventory") pod "6b3b41b1-fa64-45fc-9d41-19b0be538111" (UID: "6b3b41b1-fa64-45fc-9d41-19b0be538111"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.633206 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.633428 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.633437 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b3b41b1-fa64-45fc-9d41-19b0be538111-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.633448 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbwsw\" (UniqueName: \"kubernetes.io/projected/6b3b41b1-fa64-45fc-9d41-19b0be538111-kube-api-access-nbwsw\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.906716 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" event={"ID":"6b3b41b1-fa64-45fc-9d41-19b0be538111","Type":"ContainerDied","Data":"e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319"} Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.906794 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1596066a94eb7c9a7f51576450417328847e5b789299c37e9792e651a87d319" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.906798 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-hhcgm" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.974458 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-wq8r6"] Nov 24 13:06:07 crc kubenswrapper[4752]: E1124 13:06:07.975032 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3b41b1-fa64-45fc-9d41-19b0be538111" containerName="validate-network-openstack-openstack-cell1" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.975068 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3b41b1-fa64-45fc-9d41-19b0be538111" containerName="validate-network-openstack-openstack-cell1" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.975352 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3b41b1-fa64-45fc-9d41-19b0be538111" containerName="validate-network-openstack-openstack-cell1" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.976244 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.978722 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.979186 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.979505 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.979782 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:06:07 crc kubenswrapper[4752]: I1124 13:06:07.992090 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-wq8r6"] Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.042061 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qnt\" (UniqueName: \"kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.042112 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.042275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.042306 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.143949 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.144089 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qnt\" (UniqueName: \"kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 
13:06:08.144119 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.144210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.149731 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.150316 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.152083 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.171214 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qnt\" (UniqueName: \"kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt\") pod \"install-os-openstack-openstack-cell1-wq8r6\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.295972 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.856148 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-wq8r6"] Nov 24 13:06:08 crc kubenswrapper[4752]: I1124 13:06:08.916613 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" event={"ID":"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0","Type":"ContainerStarted","Data":"35a039001d7692319ccedc9d67aa15f9bc5fc7426b78a16992f9c2947c1299f6"} Nov 24 13:06:09 crc kubenswrapper[4752]: I1124 13:06:09.728720 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:06:09 crc kubenswrapper[4752]: E1124 13:06:09.729318 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:06:09 crc kubenswrapper[4752]: I1124 13:06:09.930232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" event={"ID":"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0","Type":"ContainerStarted","Data":"c2f28a4c1977355eb5e4a12c7cad4d48ea384413118040333f8e9589570d8fe9"} Nov 24 13:06:09 crc kubenswrapper[4752]: I1124 13:06:09.960048 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" podStartSLOduration=2.5263453 podStartE2EDuration="2.960008833s" podCreationTimestamp="2025-11-24 13:06:07 +0000 UTC" firstStartedPulling="2025-11-24 13:06:08.861022739 +0000 UTC m=+7174.845843028" lastFinishedPulling="2025-11-24 13:06:09.294686272 +0000 UTC m=+7175.279506561" observedRunningTime="2025-11-24 13:06:09.954152465 +0000 UTC m=+7175.938972764" watchObservedRunningTime="2025-11-24 13:06:09.960008833 +0000 UTC m=+7175.944829132" Nov 24 13:06:21 crc kubenswrapper[4752]: I1124 13:06:21.729176 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:06:21 crc kubenswrapper[4752]: E1124 13:06:21.730200 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:06:33 crc kubenswrapper[4752]: I1124 13:06:33.728599 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:06:33 crc kubenswrapper[4752]: E1124 13:06:33.729421 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:06:47 crc kubenswrapper[4752]: I1124 13:06:47.727702 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:06:47 crc kubenswrapper[4752]: E1124 13:06:47.728402 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:06:54 crc kubenswrapper[4752]: I1124 13:06:54.433045 4752 generic.go:334] "Generic (PLEG): container finished" podID="7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" containerID="c2f28a4c1977355eb5e4a12c7cad4d48ea384413118040333f8e9589570d8fe9" exitCode=0 Nov 24 13:06:54 crc kubenswrapper[4752]: I1124 13:06:54.433151 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" event={"ID":"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0","Type":"ContainerDied","Data":"c2f28a4c1977355eb5e4a12c7cad4d48ea384413118040333f8e9589570d8fe9"} Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.013082 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.093707 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key\") pod \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.093781 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory\") pod \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.093884 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qnt\" (UniqueName: \"kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt\") pod \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.093903 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph\") pod \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\" (UID: \"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0\") " Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.110127 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph" (OuterVolumeSpecName: "ceph") pod "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" (UID: "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.110125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt" (OuterVolumeSpecName: "kube-api-access-j7qnt") pod "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" (UID: "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0"). InnerVolumeSpecName "kube-api-access-j7qnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.144808 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" (UID: "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.153350 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory" (OuterVolumeSpecName: "inventory") pod "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" (UID: "7f26d91c-fd10-48f3-b4cd-1cecf074d2b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.196118 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.196169 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.196181 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qnt\" (UniqueName: \"kubernetes.io/projected/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-kube-api-access-j7qnt\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.196192 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7f26d91c-fd10-48f3-b4cd-1cecf074d2b0-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.461797 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" event={"ID":"7f26d91c-fd10-48f3-b4cd-1cecf074d2b0","Type":"ContainerDied","Data":"35a039001d7692319ccedc9d67aa15f9bc5fc7426b78a16992f9c2947c1299f6"} Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.461845 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a039001d7692319ccedc9d67aa15f9bc5fc7426b78a16992f9c2947c1299f6" Nov 24 13:06:56 crc kubenswrapper[4752]: I1124 13:06:56.461956 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-wq8r6" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.114613 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rqzkn"] Nov 24 13:06:57 crc kubenswrapper[4752]: E1124 13:06:57.116115 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" containerName="install-os-openstack-openstack-cell1" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.116194 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" containerName="install-os-openstack-openstack-cell1" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.116452 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f26d91c-fd10-48f3-b4cd-1cecf074d2b0" containerName="install-os-openstack-openstack-cell1" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.117337 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.122285 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.122610 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.122783 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.122485 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.131241 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rqzkn"] Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.215727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.215859 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.216218 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.216397 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zd9l\" (UniqueName: 
\"kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.318271 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.318359 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zd9l\" (UniqueName: \"kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.318404 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.318435 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.322927 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.323063 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.327297 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: I1124 13:06:57.341438 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zd9l\" (UniqueName: \"kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l\") pod \"configure-os-openstack-openstack-cell1-rqzkn\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:57 crc kubenswrapper[4752]: 
I1124 13:06:57.443847 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:06:58 crc kubenswrapper[4752]: I1124 13:06:58.044495 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-rqzkn"] Nov 24 13:06:58 crc kubenswrapper[4752]: I1124 13:06:58.489639 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" event={"ID":"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0","Type":"ContainerStarted","Data":"77952ec4d6ea165f0d192a69c8e10ee79467042c9674fb0384e362d61463fb57"} Nov 24 13:06:58 crc kubenswrapper[4752]: I1124 13:06:58.729726 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:06:58 crc kubenswrapper[4752]: E1124 13:06:58.730020 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:06:59 crc kubenswrapper[4752]: I1124 13:06:59.501607 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" event={"ID":"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0","Type":"ContainerStarted","Data":"854be9874e567a34c95987d6afc2505adc85354f620215f45bbea1764bfb296a"} Nov 24 13:06:59 crc kubenswrapper[4752]: I1124 13:06:59.531364 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" podStartSLOduration=2.147460408 podStartE2EDuration="2.531342173s" podCreationTimestamp="2025-11-24 13:06:57 +0000 UTC" firstStartedPulling="2025-11-24 13:06:58.053948471 +0000 UTC m=+7224.038768760" lastFinishedPulling="2025-11-24 13:06:58.437830236 +0000 UTC m=+7224.422650525" observedRunningTime="2025-11-24 13:06:59.520640456 +0000 UTC m=+7225.505460745" watchObservedRunningTime="2025-11-24 13:06:59.531342173 +0000 UTC m=+7225.516162482" Nov 24 13:07:09 crc kubenswrapper[4752]: I1124 13:07:09.728764 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:07:09 crc kubenswrapper[4752]: E1124 13:07:09.729611 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:07:20 crc kubenswrapper[4752]: I1124 13:07:20.729224 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:07:20 crc kubenswrapper[4752]: E1124 13:07:20.730159 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:07:31 crc kubenswrapper[4752]: I1124 13:07:31.728383 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:07:31 crc kubenswrapper[4752]: E1124 13:07:31.729714 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:07:44 crc kubenswrapper[4752]: I1124 13:07:44.733814 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:07:44 crc kubenswrapper[4752]: E1124 13:07:44.734674 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:07:45 crc kubenswrapper[4752]: I1124 13:07:45.029237 4752 generic.go:334] "Generic (PLEG): container finished" podID="ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" containerID="854be9874e567a34c95987d6afc2505adc85354f620215f45bbea1764bfb296a" exitCode=0 Nov 24 13:07:45 crc kubenswrapper[4752]: I1124 13:07:45.029300 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" event={"ID":"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0","Type":"ContainerDied","Data":"854be9874e567a34c95987d6afc2505adc85354f620215f45bbea1764bfb296a"} Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.538849 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.659911 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory\") pod \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.660175 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph\") pod \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.660404 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zd9l\" (UniqueName: \"kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l\") pod \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.660450 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key\") pod \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\" (UID: \"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0\") " Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.667435 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l" (OuterVolumeSpecName: "kube-api-access-9zd9l") pod "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" (UID: "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0"). InnerVolumeSpecName "kube-api-access-9zd9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.669665 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph" (OuterVolumeSpecName: "ceph") pod "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" (UID: "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.698277 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory" (OuterVolumeSpecName: "inventory") pod "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" (UID: "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.705907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" (UID: "ee7cb878-862d-49ed-ace6-1ba9b4b6daf0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.764868 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.764901 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zd9l\" (UniqueName: \"kubernetes.io/projected/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-kube-api-access-9zd9l\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.764913 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:46 crc kubenswrapper[4752]: I1124 13:07:46.764925 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee7cb878-862d-49ed-ace6-1ba9b4b6daf0-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.053245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" event={"ID":"ee7cb878-862d-49ed-ace6-1ba9b4b6daf0","Type":"ContainerDied","Data":"77952ec4d6ea165f0d192a69c8e10ee79467042c9674fb0384e362d61463fb57"} Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.053613 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77952ec4d6ea165f0d192a69c8e10ee79467042c9674fb0384e362d61463fb57" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.053327 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-rqzkn" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.199074 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-fm27m"] Nov 24 13:07:47 crc kubenswrapper[4752]: E1124 13:07:47.199577 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" containerName="configure-os-openstack-openstack-cell1" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.199600 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" containerName="configure-os-openstack-openstack-cell1" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.199925 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7cb878-862d-49ed-ace6-1ba9b4b6daf0" containerName="configure-os-openstack-openstack-cell1" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.200865 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.202676 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.203079 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.203351 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.203364 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.214326 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-fm27m"] Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.379683 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7k5\" (UniqueName: \"kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.380252 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.380517 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.380709 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.482237 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.482336 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.482405 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7k5\" 
(UniqueName: \"kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.482530 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.487452 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.487791 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.487996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.513106 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7k5\" (UniqueName: \"kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5\") pod \"ssh-known-hosts-openstack-fm27m\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:47 crc kubenswrapper[4752]: I1124 13:07:47.522356 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:48 crc kubenswrapper[4752]: I1124 13:07:48.033988 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-fm27m"] Nov 24 13:07:48 crc kubenswrapper[4752]: I1124 13:07:48.063226 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fm27m" event={"ID":"38958449-fe2e-4e55-899d-c6d573fbf809","Type":"ContainerStarted","Data":"e56bb5834896408a669d25f1ecf86eee6ce57e238c3799597270bc9e8ff846ef"} Nov 24 13:07:49 crc kubenswrapper[4752]: I1124 13:07:49.076260 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fm27m" event={"ID":"38958449-fe2e-4e55-899d-c6d573fbf809","Type":"ContainerStarted","Data":"a2b162d06d44454f6a2aa8c2b11137ee52e004627bf866a171e1287b92ef5d73"} Nov 24 13:07:49 crc kubenswrapper[4752]: I1124 13:07:49.104410 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-fm27m" podStartSLOduration=1.544089408 podStartE2EDuration="2.104392315s" podCreationTimestamp="2025-11-24 13:07:47 +0000 UTC" firstStartedPulling="2025-11-24 13:07:48.042384352 +0000 UTC m=+7274.027204651" lastFinishedPulling="2025-11-24 13:07:48.602687269 +0000 UTC m=+7274.587507558" observedRunningTime="2025-11-24 13:07:49.096089577 +0000 UTC m=+7275.080909866" watchObservedRunningTime="2025-11-24 13:07:49.104392315 +0000 UTC m=+7275.089212604" Nov 24 13:07:57 crc kubenswrapper[4752]: I1124 13:07:57.200620 4752 generic.go:334] "Generic (PLEG): container finished" podID="38958449-fe2e-4e55-899d-c6d573fbf809" containerID="a2b162d06d44454f6a2aa8c2b11137ee52e004627bf866a171e1287b92ef5d73" exitCode=0 Nov 24 13:07:57 crc kubenswrapper[4752]: I1124 13:07:57.200728 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fm27m" event={"ID":"38958449-fe2e-4e55-899d-c6d573fbf809","Type":"ContainerDied","Data":"a2b162d06d44454f6a2aa8c2b11137ee52e004627bf866a171e1287b92ef5d73"} Nov 24 13:07:57 crc kubenswrapper[4752]: I1124 13:07:57.729009 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:07:57 crc kubenswrapper[4752]: E1124 13:07:57.729339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.689509 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.841072 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx7k5\" (UniqueName: \"kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5\") pod \"38958449-fe2e-4e55-899d-c6d573fbf809\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.841221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph\") pod \"38958449-fe2e-4e55-899d-c6d573fbf809\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.841332 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1\") pod \"38958449-fe2e-4e55-899d-c6d573fbf809\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.841480 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0\") pod \"38958449-fe2e-4e55-899d-c6d573fbf809\" (UID: \"38958449-fe2e-4e55-899d-c6d573fbf809\") " Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.849375 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5" (OuterVolumeSpecName: "kube-api-access-lx7k5") pod "38958449-fe2e-4e55-899d-c6d573fbf809" (UID: "38958449-fe2e-4e55-899d-c6d573fbf809"). InnerVolumeSpecName "kube-api-access-lx7k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.849698 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph" (OuterVolumeSpecName: "ceph") pod "38958449-fe2e-4e55-899d-c6d573fbf809" (UID: "38958449-fe2e-4e55-899d-c6d573fbf809"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.873532 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "38958449-fe2e-4e55-899d-c6d573fbf809" (UID: "38958449-fe2e-4e55-899d-c6d573fbf809"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.880182 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "38958449-fe2e-4e55-899d-c6d573fbf809" (UID: "38958449-fe2e-4e55-899d-c6d573fbf809"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.944719 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.944773 4752 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.944787 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx7k5\" (UniqueName: \"kubernetes.io/projected/38958449-fe2e-4e55-899d-c6d573fbf809-kube-api-access-lx7k5\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:58 crc kubenswrapper[4752]: I1124 13:07:58.944799 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/38958449-fe2e-4e55-899d-c6d573fbf809-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.223686 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fm27m" event={"ID":"38958449-fe2e-4e55-899d-c6d573fbf809","Type":"ContainerDied","Data":"e56bb5834896408a669d25f1ecf86eee6ce57e238c3799597270bc9e8ff846ef"} Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.223780 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56bb5834896408a669d25f1ecf86eee6ce57e238c3799597270bc9e8ff846ef" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.224369 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fm27m" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.334044 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bcvfv"] Nov 24 13:07:59 crc kubenswrapper[4752]: E1124 13:07:59.334908 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38958449-fe2e-4e55-899d-c6d573fbf809" containerName="ssh-known-hosts-openstack" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.334929 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="38958449-fe2e-4e55-899d-c6d573fbf809" containerName="ssh-known-hosts-openstack" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.335213 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="38958449-fe2e-4e55-899d-c6d573fbf809" containerName="ssh-known-hosts-openstack" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.336206 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.339872 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.339973 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.340630 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.340949 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.379234 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bcvfv"] Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.462351 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.462450 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.462483 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjhq\" (UniqueName: \"kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.462704 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.565905 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.565969 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.566001 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sqjhq\" (UniqueName: \"kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.566103 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.576818 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.581090 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.581238 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.585142 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjhq\" (UniqueName: \"kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq\") pod \"run-os-openstack-openstack-cell1-bcvfv\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:07:59 crc kubenswrapper[4752]: I1124 13:07:59.675355 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:08:00 crc kubenswrapper[4752]: I1124 13:08:00.289058 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-bcvfv"] Nov 24 13:08:00 crc kubenswrapper[4752]: W1124 13:08:00.298587 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb41e8f1_6ced_4ade_b2d7_6458f2cb5808.slice/crio-7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929 WatchSource:0}: Error finding container 7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929: Status 404 returned error can't find the container with id 7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929 Nov 24 13:08:01 crc kubenswrapper[4752]: I1124 13:08:01.258293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" event={"ID":"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808","Type":"ContainerStarted","Data":"f4938b7980add6a96b2060e5c3634f765d93740ceaf28570f65db4289de4cb6c"} Nov 24 13:08:01 crc kubenswrapper[4752]: I1124 13:08:01.259184 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" event={"ID":"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808","Type":"ContainerStarted","Data":"7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929"} Nov 24 13:08:01 crc kubenswrapper[4752]: I1124 13:08:01.275803 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" podStartSLOduration=1.8448398350000001 podStartE2EDuration="2.275783811s" podCreationTimestamp="2025-11-24 13:07:59 +0000 UTC" firstStartedPulling="2025-11-24 13:08:00.30027962 +0000 UTC m=+7286.285099909" lastFinishedPulling="2025-11-24 13:08:00.731223606 +0000 UTC m=+7286.716043885" observedRunningTime="2025-11-24 13:08:01.27576764 +0000 UTC m=+7287.260587929" watchObservedRunningTime="2025-11-24 13:08:01.275783811 +0000 UTC m=+7287.260604120" Nov 24 13:08:09 crc kubenswrapper[4752]: I1124 13:08:09.344958 4752 generic.go:334] "Generic (PLEG): container finished" podID="bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" containerID="f4938b7980add6a96b2060e5c3634f765d93740ceaf28570f65db4289de4cb6c" exitCode=0 Nov 24 13:08:09 crc kubenswrapper[4752]: I1124 13:08:09.345078 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" event={"ID":"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808","Type":"ContainerDied","Data":"f4938b7980add6a96b2060e5c3634f765d93740ceaf28570f65db4289de4cb6c"} Nov 24 13:08:09 crc kubenswrapper[4752]: I1124 13:08:09.727652 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:08:09 crc kubenswrapper[4752]: E1124 13:08:09.727987 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:08:10 crc kubenswrapper[4752]: I1124 13:08:10.869895 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.054516 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph\") pod \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.054863 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory\") pod \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.055025 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key\") pod \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.055106 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjhq\" (UniqueName: \"kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq\") pod \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\" (UID: \"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808\") " Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.061189 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph" (OuterVolumeSpecName: "ceph") pod "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" (UID: "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.064487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq" (OuterVolumeSpecName: "kube-api-access-sqjhq") pod "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" (UID: "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808"). InnerVolumeSpecName "kube-api-access-sqjhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.090159 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory" (OuterVolumeSpecName: "inventory") pod "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" (UID: "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.099909 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" (UID: "bb41e8f1-6ced-4ade-b2d7-6458f2cb5808"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.158054 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.158081 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.158090 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.158099 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjhq\" (UniqueName: \"kubernetes.io/projected/bb41e8f1-6ced-4ade-b2d7-6458f2cb5808-kube-api-access-sqjhq\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.365897 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" event={"ID":"bb41e8f1-6ced-4ade-b2d7-6458f2cb5808","Type":"ContainerDied","Data":"7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929"} Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.365937 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca1d4eb6e629ff94d9d7e4ee45a2f4c2631ad9d27f072d69ba31b3f2ace1929" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.366003 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-bcvfv" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.462521 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xhk6b"] Nov 24 13:08:11 crc kubenswrapper[4752]: E1124 13:08:11.463197 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" containerName="run-os-openstack-openstack-cell1" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.463223 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" containerName="run-os-openstack-openstack-cell1" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.463604 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb41e8f1-6ced-4ade-b2d7-6458f2cb5808" containerName="run-os-openstack-openstack-cell1" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.464788 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.467312 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.467561 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.467721 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.467996 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.479458 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xhk6b"] Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.565965 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.566448 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.566532 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.566618 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g6c\" (UniqueName: \"kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.670280 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.670556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.670694 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.670839 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77g6c\" (UniqueName: \"kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.678591 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.680617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.691407 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.691890 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g6c\" (UniqueName: \"kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c\") pod \"reboot-os-openstack-openstack-cell1-xhk6b\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:11 crc kubenswrapper[4752]: I1124 13:08:11.792913 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:12 crc kubenswrapper[4752]: I1124 13:08:12.332813 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xhk6b"] Nov 24 13:08:12 crc kubenswrapper[4752]: I1124 13:08:12.388466 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" event={"ID":"2948a607-5d86-4e4e-94ef-cc1cc219ef47","Type":"ContainerStarted","Data":"107363379dd58286597855961f31e02855b408209fbbb0d2dc5ad7d893dd0585"} Nov 24 13:08:13 crc kubenswrapper[4752]: I1124 13:08:13.400682 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" event={"ID":"2948a607-5d86-4e4e-94ef-cc1cc219ef47","Type":"ContainerStarted","Data":"59cdd5fe692dd0317977b2ace9f24d9986e6489232bccaffcf81d285cc391790"} Nov 24 13:08:13 crc kubenswrapper[4752]: I1124 13:08:13.428278 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" podStartSLOduration=1.9451568099999998 podStartE2EDuration="2.428255361s" podCreationTimestamp="2025-11-24 13:08:11 +0000 UTC" firstStartedPulling="2025-11-24 13:08:12.333483419 +0000 UTC m=+7298.318303708" lastFinishedPulling="2025-11-24 13:08:12.81658196 +0000 UTC m=+7298.801402259" observedRunningTime="2025-11-24 13:08:13.421341003 +0000 UTC m=+7299.406161302" watchObservedRunningTime="2025-11-24 13:08:13.428255361 +0000 UTC m=+7299.413075660" Nov 24 13:08:21 crc kubenswrapper[4752]: I1124 13:08:21.728437 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:08:21 crc kubenswrapper[4752]: E1124 13:08:21.729354 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:08:29 crc kubenswrapper[4752]: I1124 13:08:29.576111 4752 generic.go:334] "Generic (PLEG): container finished" podID="2948a607-5d86-4e4e-94ef-cc1cc219ef47" containerID="59cdd5fe692dd0317977b2ace9f24d9986e6489232bccaffcf81d285cc391790" exitCode=0 Nov 24 13:08:29 crc kubenswrapper[4752]: I1124 13:08:29.576200 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" event={"ID":"2948a607-5d86-4e4e-94ef-cc1cc219ef47","Type":"ContainerDied","Data":"59cdd5fe692dd0317977b2ace9f24d9986e6489232bccaffcf81d285cc391790"} Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.049967 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.239536 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory\") pod \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.239797 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key\") pod \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.239934 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77g6c\" (UniqueName: \"kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c\") pod \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.239984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph\") pod \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\" (UID: \"2948a607-5d86-4e4e-94ef-cc1cc219ef47\") " Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.247136 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph" (OuterVolumeSpecName: "ceph") pod "2948a607-5d86-4e4e-94ef-cc1cc219ef47" (UID: "2948a607-5d86-4e4e-94ef-cc1cc219ef47"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.247894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c" (OuterVolumeSpecName: "kube-api-access-77g6c") pod "2948a607-5d86-4e4e-94ef-cc1cc219ef47" (UID: "2948a607-5d86-4e4e-94ef-cc1cc219ef47"). InnerVolumeSpecName "kube-api-access-77g6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.271443 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2948a607-5d86-4e4e-94ef-cc1cc219ef47" (UID: "2948a607-5d86-4e4e-94ef-cc1cc219ef47"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.274809 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory" (OuterVolumeSpecName: "inventory") pod "2948a607-5d86-4e4e-94ef-cc1cc219ef47" (UID: "2948a607-5d86-4e4e-94ef-cc1cc219ef47"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.343304 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.343340 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77g6c\" (UniqueName: \"kubernetes.io/projected/2948a607-5d86-4e4e-94ef-cc1cc219ef47-kube-api-access-77g6c\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.343351 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.343360 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2948a607-5d86-4e4e-94ef-cc1cc219ef47-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.601944 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" event={"ID":"2948a607-5d86-4e4e-94ef-cc1cc219ef47","Type":"ContainerDied","Data":"107363379dd58286597855961f31e02855b408209fbbb0d2dc5ad7d893dd0585"} Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.602485 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="107363379dd58286597855961f31e02855b408209fbbb0d2dc5ad7d893dd0585" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.602017 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xhk6b" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.711199 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-7gskg"] Nov 24 13:08:31 crc kubenswrapper[4752]: E1124 13:08:31.711776 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2948a607-5d86-4e4e-94ef-cc1cc219ef47" containerName="reboot-os-openstack-openstack-cell1" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.711796 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2948a607-5d86-4e4e-94ef-cc1cc219ef47" containerName="reboot-os-openstack-openstack-cell1" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.712090 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2948a607-5d86-4e4e-94ef-cc1cc219ef47" containerName="reboot-os-openstack-openstack-cell1" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.713077 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.714973 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.715294 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.715445 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.720857 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-7gskg"] Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.725197 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.752944 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.753288 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.753404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.753550 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.753685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.753810 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory\") pod 
\"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754002 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hp2\" (UniqueName: \"kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754335 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754438 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754568 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.754716 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857383 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857555 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hp2\" (UniqueName: 
\"kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857595 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857617 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857689 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857881 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857920 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857951 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.857998 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.858045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.858071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.863188 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.863271 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.863771 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.864678 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.864962 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.865192 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.865398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.865597 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.866324 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.866780 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.867456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:31 crc kubenswrapper[4752]: I1124 13:08:31.875027 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hp2\" (UniqueName: \"kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2\") pod \"install-certs-openstack-openstack-cell1-7gskg\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") " pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:32 crc kubenswrapper[4752]: I1124 13:08:32.081069 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 13:08:32 crc kubenswrapper[4752]: I1124 13:08:32.624925 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-7gskg"]
Nov 24 13:08:32 crc kubenswrapper[4752]: I1124 13:08:32.728670 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:08:32 crc kubenswrapper[4752]: E1124 13:08:32.729366 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:08:33 crc kubenswrapper[4752]: I1124 13:08:33.619873 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" event={"ID":"b27b5307-a3a9-405c-beb9-5a8774f330d6","Type":"ContainerStarted","Data":"6384bdf161b1dc116ce40870d67d6e7a8c743fe7b94597284a1491c94df29a31"}
Nov 24 13:08:33 crc kubenswrapper[4752]: I1124 13:08:33.620318 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" event={"ID":"b27b5307-a3a9-405c-beb9-5a8774f330d6","Type":"ContainerStarted","Data":"be06bff5344000dec5c249bb28c52491f2e8e629f07907724a01363c79391c20"}
Nov 24 13:08:33 crc kubenswrapper[4752]: I1124 13:08:33.652934 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" podStartSLOduration=2.15206141 podStartE2EDuration="2.652909351s" podCreationTimestamp="2025-11-24 13:08:31 +0000 UTC" firstStartedPulling="2025-11-24 13:08:32.630185335 +0000 UTC m=+7318.615005624" lastFinishedPulling="2025-11-24 13:08:33.131033266 +0000 UTC m=+7319.115853565" observedRunningTime="2025-11-24 13:08:33.643328786 +0000 UTC m=+7319.628149075" watchObservedRunningTime="2025-11-24 13:08:33.652909351 +0000 UTC m=+7319.637729640"
Nov 24 13:08:43 crc kubenswrapper[4752]: I1124 13:08:43.727872 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:08:43 crc kubenswrapper[4752]: E1124 13:08:43.730124 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:08:51 crc kubenswrapper[4752]: I1124 13:08:51.797006 4752 generic.go:334] "Generic (PLEG): container finished" podID="b27b5307-a3a9-405c-beb9-5a8774f330d6" containerID="6384bdf161b1dc116ce40870d67d6e7a8c743fe7b94597284a1491c94df29a31" exitCode=0
Nov 24 13:08:51 crc kubenswrapper[4752]: I1124 13:08:51.797082 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" event={"ID":"b27b5307-a3a9-405c-beb9-5a8774f330d6","Type":"ContainerDied","Data":"6384bdf161b1dc116ce40870d67d6e7a8c743fe7b94597284a1491c94df29a31"}
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.331873 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-7gskg"
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390478 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390554 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5hp2\" (UniqueName: \"kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390676 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390708 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390735 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390815 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390831 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390878 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.390974 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle\") pod \"b27b5307-a3a9-405c-beb9-5a8774f330d6\" (UID: \"b27b5307-a3a9-405c-beb9-5a8774f330d6\") "
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.397645 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.398991 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2" (OuterVolumeSpecName: "kube-api-access-r5hp2") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "kube-api-access-r5hp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.399745 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.400398 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.400450 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.400504 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.400568 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.401955 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph" (OuterVolumeSpecName: "ceph") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.406943 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.407663 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.428399 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.430034 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory" (OuterVolumeSpecName: "inventory") pod "b27b5307-a3a9-405c-beb9-5a8774f330d6" (UID: "b27b5307-a3a9-405c-beb9-5a8774f330d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493607 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493638 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ceph\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493648 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493658 4752 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493667 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493679 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493688 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-inventory\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493696 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5hp2\" (UniqueName: \"kubernetes.io/projected/b27b5307-a3a9-405c-beb9-5a8774f330d6-kube-api-access-r5hp2\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493705 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493714 4752 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493723 4752 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.493730 4752 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b27b5307-a3a9-405c-beb9-5a8774f330d6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.821831 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" event={"ID":"b27b5307-a3a9-405c-beb9-5a8774f330d6","Type":"ContainerDied","Data":"be06bff5344000dec5c249bb28c52491f2e8e629f07907724a01363c79391c20"}
pod="openstack/install-certs-openstack-openstack-cell1-7gskg" event={"ID":"b27b5307-a3a9-405c-beb9-5a8774f330d6","Type":"ContainerDied","Data":"be06bff5344000dec5c249bb28c52491f2e8e629f07907724a01363c79391c20"} Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.821901 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be06bff5344000dec5c249bb28c52491f2e8e629f07907724a01363c79391c20" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.821921 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-7gskg" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.923861 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-s67n5"] Nov 24 13:08:53 crc kubenswrapper[4752]: E1124 13:08:53.924637 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27b5307-a3a9-405c-beb9-5a8774f330d6" containerName="install-certs-openstack-openstack-cell1" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.924661 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27b5307-a3a9-405c-beb9-5a8774f330d6" containerName="install-certs-openstack-openstack-cell1" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.924951 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27b5307-a3a9-405c-beb9-5a8774f330d6" containerName="install-certs-openstack-openstack-cell1" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.925650 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.927862 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.927997 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.928009 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.931645 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:08:53 crc kubenswrapper[4752]: I1124 13:08:53.939462 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-s67n5"] Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.005502 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4chv\" (UniqueName: \"kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.005805 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.005853 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.005984 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.107479 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.107521 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.107549 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.107649 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4chv\" (UniqueName: \"kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.112329 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.113579 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.115340 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.126219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4chv\" (UniqueName: \"kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv\") pod \"ceph-client-openstack-openstack-cell1-s67n5\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.243848 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.812221 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-s67n5"] Nov 24 13:08:54 crc kubenswrapper[4752]: I1124 13:08:54.833906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" event={"ID":"7acb322e-d9f4-48b7-a023-d42f3614ecc7","Type":"ContainerStarted","Data":"ed0448d4c78535e76695db50be2b08a1d6a58bb4d549fc443edac9c38632eaf9"} Nov 24 13:08:55 crc kubenswrapper[4752]: I1124 13:08:55.845793 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" event={"ID":"7acb322e-d9f4-48b7-a023-d42f3614ecc7","Type":"ContainerStarted","Data":"4834c350b5d51c565672fa1d4ca3038c38b89f2386e4ef0bdc044061144a8389"} Nov 24 13:08:55 crc kubenswrapper[4752]: I1124 13:08:55.866256 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" podStartSLOduration=2.402095783 podStartE2EDuration="2.8662334s" podCreationTimestamp="2025-11-24 13:08:53 +0000 UTC" firstStartedPulling="2025-11-24 13:08:54.819035714 +0000 UTC m=+7340.803856003" lastFinishedPulling="2025-11-24 13:08:55.283173301 +0000 UTC m=+7341.267993620" observedRunningTime="2025-11-24 13:08:55.865627013 +0000 UTC m=+7341.850447312" watchObservedRunningTime="2025-11-24 13:08:55.8662334 +0000 UTC m=+7341.851053699" Nov 24 13:08:58 crc kubenswrapper[4752]: I1124 13:08:58.728292 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:08:58 crc kubenswrapper[4752]: E1124 13:08:58.729406 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:09:00 crc kubenswrapper[4752]: I1124 13:09:00.913964 4752 generic.go:334] "Generic (PLEG): container finished" podID="7acb322e-d9f4-48b7-a023-d42f3614ecc7" containerID="4834c350b5d51c565672fa1d4ca3038c38b89f2386e4ef0bdc044061144a8389" exitCode=0 Nov 24 13:09:00 crc kubenswrapper[4752]: I1124 13:09:00.914237 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" event={"ID":"7acb322e-d9f4-48b7-a023-d42f3614ecc7","Type":"ContainerDied","Data":"4834c350b5d51c565672fa1d4ca3038c38b89f2386e4ef0bdc044061144a8389"} Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.352238 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.482936 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory\") pod \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.483010 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key\") pod \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.483068 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph\") pod \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.483368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4chv\" (UniqueName: \"kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv\") pod \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\" (UID: \"7acb322e-d9f4-48b7-a023-d42f3614ecc7\") " Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.492058 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv" (OuterVolumeSpecName: "kube-api-access-d4chv") pod "7acb322e-d9f4-48b7-a023-d42f3614ecc7" (UID: "7acb322e-d9f4-48b7-a023-d42f3614ecc7"). InnerVolumeSpecName "kube-api-access-d4chv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.492820 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph" (OuterVolumeSpecName: "ceph") pod "7acb322e-d9f4-48b7-a023-d42f3614ecc7" (UID: "7acb322e-d9f4-48b7-a023-d42f3614ecc7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.510788 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7acb322e-d9f4-48b7-a023-d42f3614ecc7" (UID: "7acb322e-d9f4-48b7-a023-d42f3614ecc7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.528785 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory" (OuterVolumeSpecName: "inventory") pod "7acb322e-d9f4-48b7-a023-d42f3614ecc7" (UID: "7acb322e-d9f4-48b7-a023-d42f3614ecc7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.586132 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4chv\" (UniqueName: \"kubernetes.io/projected/7acb322e-d9f4-48b7-a023-d42f3614ecc7-kube-api-access-d4chv\") on node \"crc\" DevicePath \"\"" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.586170 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.586182 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.586191 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acb322e-d9f4-48b7-a023-d42f3614ecc7-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.942674 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" event={"ID":"7acb322e-d9f4-48b7-a023-d42f3614ecc7","Type":"ContainerDied","Data":"ed0448d4c78535e76695db50be2b08a1d6a58bb4d549fc443edac9c38632eaf9"} Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.942715 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed0448d4c78535e76695db50be2b08a1d6a58bb4d549fc443edac9c38632eaf9" Nov 24 13:09:02 crc kubenswrapper[4752]: I1124 13:09:02.942804 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-s67n5" Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.024120 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-tvrxz"] Nov 24 13:09:03 crc kubenswrapper[4752]: E1124 13:09:03.024627 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7acb322e-d9f4-48b7-a023-d42f3614ecc7" containerName="ceph-client-openstack-openstack-cell1" Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.024645 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7acb322e-d9f4-48b7-a023-d42f3614ecc7" containerName="ceph-client-openstack-openstack-cell1" Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.025375 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7acb322e-d9f4-48b7-a023-d42f3614ecc7" containerName="ceph-client-openstack-openstack-cell1" Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.026396 4752 util.go:30] "No sandbox for pod can be found. 
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.030090 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.030344 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.030285 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.030317 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.030457 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.035723 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-tvrxz"]
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.197966 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.198138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.198204 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.198475 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.198635 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.198657 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk62w\" (UniqueName: \"kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301025 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301124 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301172 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.301234 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk62w\" (UniqueName: \"kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.302307 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.307158 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.307170 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.308167 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.308233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.320327 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk62w\" (UniqueName: \"kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w\") pod \"ovn-openstack-openstack-cell1-tvrxz\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") " pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.368386 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:09:03 crc kubenswrapper[4752]: I1124 13:09:03.944384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-tvrxz"]
Nov 24 13:09:04 crc kubenswrapper[4752]: I1124 13:09:04.961576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-tvrxz" event={"ID":"2b92c65c-7cef-4eca-b3d5-452443fc7fb3","Type":"ContainerStarted","Data":"c4bd15d3a2ec91d18269911db5fa9ea5183dd7be669c8e9f1e865c4873c8d22e"}
Nov 24 13:09:04 crc kubenswrapper[4752]: I1124 13:09:04.961899 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-tvrxz" event={"ID":"2b92c65c-7cef-4eca-b3d5-452443fc7fb3","Type":"ContainerStarted","Data":"66ded1e61a9bb27e0efc31d251b871c53515423a47c14dac5366b41472121553"}
Nov 24 13:09:11 crc kubenswrapper[4752]: I1124 13:09:11.728522 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:09:11 crc kubenswrapper[4752]: E1124 13:09:11.729647 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:09:25 crc kubenswrapper[4752]: I1124 13:09:25.727899 4752 scope.go:117] "RemoveContainer" containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12"
Nov 24 13:09:26 crc kubenswrapper[4752]: I1124 13:09:26.208403 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686"}
Nov 24 13:09:26 crc kubenswrapper[4752]: I1124 13:09:26.236230 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-tvrxz" podStartSLOduration=22.817814655 podStartE2EDuration="23.23620085s" podCreationTimestamp="2025-11-24 13:09:03 +0000 UTC" firstStartedPulling="2025-11-24 13:09:03.948947569 +0000 UTC m=+7349.933767858" lastFinishedPulling="2025-11-24 13:09:04.367333764 +0000 UTC m=+7350.352154053" observedRunningTime="2025-11-24 13:09:04.986984504 +0000 UTC m=+7350.971804793" watchObservedRunningTime="2025-11-24 13:09:26.23620085 +0000 UTC m=+7372.221021139"
Nov 24 13:10:11 crc kubenswrapper[4752]: I1124 13:10:11.707896 4752 generic.go:334] "Generic (PLEG): container finished" podID="2b92c65c-7cef-4eca-b3d5-452443fc7fb3" containerID="c4bd15d3a2ec91d18269911db5fa9ea5183dd7be669c8e9f1e865c4873c8d22e" exitCode=0
Nov 24 13:10:11 crc kubenswrapper[4752]: I1124 13:10:11.708005 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-tvrxz" event={"ID":"2b92c65c-7cef-4eca-b3d5-452443fc7fb3","Type":"ContainerDied","Data":"c4bd15d3a2ec91d18269911db5fa9ea5183dd7be669c8e9f1e865c4873c8d22e"}
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.213658 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.305795 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.305876 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk62w\" (UniqueName: \"kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.305961 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.305991 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.306084 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.306211 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory\") pod \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\" (UID: \"2b92c65c-7cef-4eca-b3d5-452443fc7fb3\") "
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.313835 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph" (OuterVolumeSpecName: "ceph") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.319030 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w" (OuterVolumeSpecName: "kube-api-access-mk62w") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "kube-api-access-mk62w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.324985 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.340623 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.342530 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.343888 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory" (OuterVolumeSpecName: "inventory") pod "2b92c65c-7cef-4eca-b3d5-452443fc7fb3" (UID: "2b92c65c-7cef-4eca-b3d5-452443fc7fb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408322 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ceph\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408558 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-inventory\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408669 4752 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408765 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk62w\" (UniqueName: \"kubernetes.io/projected/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-kube-api-access-mk62w\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408860 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.408942 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b92c65c-7cef-4eca-b3d5-452443fc7fb3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.736364 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-tvrxz" event={"ID":"2b92c65c-7cef-4eca-b3d5-452443fc7fb3","Type":"ContainerDied","Data":"66ded1e61a9bb27e0efc31d251b871c53515423a47c14dac5366b41472121553"}
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.736927 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66ded1e61a9bb27e0efc31d251b871c53515423a47c14dac5366b41472121553"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.736436 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-tvrxz"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.871229 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"]
Nov 24 13:10:13 crc kubenswrapper[4752]: E1124 13:10:13.871946 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b92c65c-7cef-4eca-b3d5-452443fc7fb3" containerName="ovn-openstack-openstack-cell1"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.871965 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b92c65c-7cef-4eca-b3d5-452443fc7fb3" containerName="ovn-openstack-openstack-cell1"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.872277 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b92c65c-7cef-4eca-b3d5-452443fc7fb3" containerName="ovn-openstack-openstack-cell1"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.873248 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"]
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.873358 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.879509 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.879571 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.879697 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.879875 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.880073 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx"
Nov 24 13:10:13 crc kubenswrapper[4752]: I1124 13:10:13.880382 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022445 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022529 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022561 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmls\" (UniqueName: \"kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022584 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022660 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.022963 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126133 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126784 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126829 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmls\" (UniqueName: \"kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126909 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.126966 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"
Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.127015 4752
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.139036 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.139869 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.139899 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.140343 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.140497 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.140826 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.150944 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmls\" (UniqueName: \"kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls\") pod \"neutron-metadata-openstack-openstack-cell1-jqv6t\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.197261 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.783497 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:10:14 crc kubenswrapper[4752]: I1124 13:10:14.790586 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jqv6t"] Nov 24 13:10:15 crc kubenswrapper[4752]: I1124 13:10:15.779057 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" event={"ID":"6a6b2962-03f6-4ed4-b504-429861f14548","Type":"ContainerStarted","Data":"b67a58b4b2d1dfca9d17e226e3f19392880156849007b3d9906644494898f944"} Nov 24 13:10:15 crc kubenswrapper[4752]: I1124 13:10:15.779927 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" event={"ID":"6a6b2962-03f6-4ed4-b504-429861f14548","Type":"ContainerStarted","Data":"ef886f202007120d6eb5101b2f4e8114fd61f5c48629a01feb1b6aca409119f5"} Nov 24 13:10:15 crc kubenswrapper[4752]: I1124 13:10:15.816849 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" podStartSLOduration=2.398118126 podStartE2EDuration="2.81681546s" podCreationTimestamp="2025-11-24 13:10:13 +0000 UTC" firstStartedPulling="2025-11-24 13:10:14.783065748 +0000 UTC m=+7420.767886057" lastFinishedPulling="2025-11-24 13:10:15.201763092 +0000 UTC m=+7421.186583391" observedRunningTime="2025-11-24 13:10:15.801219112 +0000 UTC m=+7421.786039411" watchObservedRunningTime="2025-11-24 13:10:15.81681546 +0000 UTC m=+7421.801635769" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.757419 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.760707 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.778653 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.887540 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542f4\" (UniqueName: \"kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.887686 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.887990 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.989954 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542f4\" (UniqueName: \"kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.990357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.990429 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.990953 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:47 crc kubenswrapper[4752]: I1124 13:10:47.991171 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:48 crc kubenswrapper[4752]: I1124 13:10:48.011808 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-542f4\" (UniqueName: \"kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4\") pod \"community-operators-2gz7r\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:48 crc kubenswrapper[4752]: I1124 13:10:48.080270 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:48 crc kubenswrapper[4752]: I1124 13:10:48.597610 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:10:49 crc kubenswrapper[4752]: I1124 13:10:49.302023 4752 generic.go:334] "Generic (PLEG): container finished" podID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerID="8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3" exitCode=0 Nov 24 13:10:49 crc kubenswrapper[4752]: I1124 13:10:49.302107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerDied","Data":"8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3"} Nov 24 13:10:49 crc kubenswrapper[4752]: I1124 13:10:49.302819 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerStarted","Data":"59756414a743ac55436ff2eea8c32695484feb58d2d6858ae5af29bdae5ac0e0"} Nov 24 13:10:51 crc kubenswrapper[4752]: I1124 13:10:51.324019 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerStarted","Data":"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5"} Nov 24 13:10:52 crc kubenswrapper[4752]: I1124 13:10:52.337580 4752 generic.go:334] "Generic (PLEG): container finished" podID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerID="824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5" exitCode=0 Nov 24 13:10:52 crc kubenswrapper[4752]: I1124 13:10:52.337946 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerDied","Data":"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5"} Nov 24 13:10:53 crc kubenswrapper[4752]: I1124 13:10:53.350405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerStarted","Data":"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92"} Nov 24 13:10:53 crc kubenswrapper[4752]: I1124 13:10:53.374891 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2gz7r" podStartSLOduration=2.649832213 podStartE2EDuration="6.374873166s" podCreationTimestamp="2025-11-24 13:10:47 +0000 UTC" firstStartedPulling="2025-11-24 13:10:49.304284458 +0000 UTC m=+7455.289104747" lastFinishedPulling="2025-11-24 13:10:53.029325411 +0000 UTC m=+7459.014145700" observedRunningTime="2025-11-24 13:10:53.370681356 +0000 UTC m=+7459.355501725" watchObservedRunningTime="2025-11-24 13:10:53.374873166 +0000 UTC m=+7459.359693445" Nov 24 13:10:58 crc kubenswrapper[4752]: I1124 13:10:58.081153 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:58 crc kubenswrapper[4752]: I1124 13:10:58.081901 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:58 crc kubenswrapper[4752]: I1124 13:10:58.134414 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:58 crc kubenswrapper[4752]: I1124 13:10:58.460902 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:10:58 crc kubenswrapper[4752]: I1124 13:10:58.515605 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:11:00 crc kubenswrapper[4752]: I1124 13:11:00.416176 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2gz7r" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="registry-server" containerID="cri-o://bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92" gracePeriod=2 Nov 24 13:11:00 crc kubenswrapper[4752]: I1124 13:11:00.950260 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.077316 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities\") pod \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.077447 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-542f4\" (UniqueName: \"kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4\") pod \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.077561 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content\") pod \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\" (UID: \"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c\") " Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.079021 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities" (OuterVolumeSpecName: "utilities") pod "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" (UID: "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.083862 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4" (OuterVolumeSpecName: "kube-api-access-542f4") pod "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" (UID: "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c"). InnerVolumeSpecName "kube-api-access-542f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.144923 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" (UID: "fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.180519 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-542f4\" (UniqueName: \"kubernetes.io/projected/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-kube-api-access-542f4\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.180560 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.180577 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.429141 4752 generic.go:334] "Generic (PLEG): container finished" podID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerID="bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92" exitCode=0 Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.429206 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerDied","Data":"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92"} Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.429244 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2gz7r" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.429528 4752 scope.go:117] "RemoveContainer" containerID="bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.429512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2gz7r" event={"ID":"fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c","Type":"ContainerDied","Data":"59756414a743ac55436ff2eea8c32695484feb58d2d6858ae5af29bdae5ac0e0"} Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.468926 4752 scope.go:117] "RemoveContainer" containerID="824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.478369 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.487219 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2gz7r"] Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.500439 4752 scope.go:117] "RemoveContainer" containerID="8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.558104 4752 scope.go:117] "RemoveContainer" containerID="bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92" Nov 24 13:11:01 crc kubenswrapper[4752]: E1124 13:11:01.563259 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92\": container with ID starting with bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92 not found: ID does not exist" containerID="bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.563320 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92"} err="failed to get container status \"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92\": rpc error: code = NotFound desc = could not find container \"bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92\": container with ID starting with bcd070ec00495678c73ea197ce4b8f4098107e96d8c20623938acb9bc4aa1e92 not found: ID does not exist" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.563365 4752 scope.go:117] "RemoveContainer" containerID="824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5" Nov 24 13:11:01 crc kubenswrapper[4752]: E1124 13:11:01.564293 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5\": container with ID starting with 824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5 not found: ID does not exist" containerID="824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.564376 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5"} err="failed to get container status \"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5\": rpc error: code = NotFound desc = could not find 
container \"824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5\": container with ID starting with 824af569cd3bb5fc5376c45126c9929ecd94344ea0ce502230fa1ada53c63fd5 not found: ID does not exist" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.564436 4752 scope.go:117] "RemoveContainer" containerID="8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3" Nov 24 13:11:01 crc kubenswrapper[4752]: E1124 13:11:01.565049 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3\": container with ID starting with 8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3 not found: ID does not exist" containerID="8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3" Nov 24 13:11:01 crc kubenswrapper[4752]: I1124 13:11:01.565102 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3"} err="failed to get container status \"8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3\": rpc error: code = NotFound desc = could not find container \"8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3\": container with ID starting with 8471c5a5a8229db65ede0c2f1c1e7ab4a49891b925a4b80f69b3f867461e45d3 not found: ID does not exist" Nov 24 13:11:02 crc kubenswrapper[4752]: I1124 13:11:02.746417 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" path="/var/lib/kubelet/pods/fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c/volumes" Nov 24 13:11:09 crc kubenswrapper[4752]: I1124 13:11:09.535219 4752 generic.go:334] "Generic (PLEG): container finished" podID="6a6b2962-03f6-4ed4-b504-429861f14548" containerID="b67a58b4b2d1dfca9d17e226e3f19392880156849007b3d9906644494898f944" exitCode=0 Nov 24 13:11:09 crc kubenswrapper[4752]: I1124 13:11:09.535315 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" event={"ID":"6a6b2962-03f6-4ed4-b504-429861f14548","Type":"ContainerDied","Data":"b67a58b4b2d1dfca9d17e226e3f19392880156849007b3d9906644494898f944"} Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.026569 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.094616 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.094976 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.095123 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.095196 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.095288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.095354 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.095433 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmls\" (UniqueName: \"kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls\") pod \"6a6b2962-03f6-4ed4-b504-429861f14548\" (UID: \"6a6b2962-03f6-4ed4-b504-429861f14548\") " Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.100558 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph" (OuterVolumeSpecName: "ceph") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.101125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls" (OuterVolumeSpecName: "kube-api-access-vmmls") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "kube-api-access-vmmls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.101160 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.125290 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.126311 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.126561 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory" (OuterVolumeSpecName: "inventory") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.138339 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6a6b2962-03f6-4ed4-b504-429861f14548" (UID: "6a6b2962-03f6-4ed4-b504-429861f14548"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197836 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197879 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197895 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmls\" (UniqueName: \"kubernetes.io/projected/6a6b2962-03f6-4ed4-b504-429861f14548-kube-api-access-vmmls\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197910 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197921 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197932 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.197944 4752 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6a6b2962-03f6-4ed4-b504-429861f14548-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.562194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" event={"ID":"6a6b2962-03f6-4ed4-b504-429861f14548","Type":"ContainerDied","Data":"ef886f202007120d6eb5101b2f4e8114fd61f5c48629a01feb1b6aca409119f5"} Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.562255 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef886f202007120d6eb5101b2f4e8114fd61f5c48629a01feb1b6aca409119f5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.562254 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jqv6t" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.672566 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ll4l5"] Nov 24 13:11:11 crc kubenswrapper[4752]: E1124 13:11:11.673431 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6b2962-03f6-4ed4-b504-429861f14548" containerName="neutron-metadata-openstack-openstack-cell1" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.673456 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6b2962-03f6-4ed4-b504-429861f14548" containerName="neutron-metadata-openstack-openstack-cell1" Nov 24 13:11:11 crc kubenswrapper[4752]: E1124 13:11:11.673474 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="extract-content" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.673484 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="extract-content" Nov 24 13:11:11 crc kubenswrapper[4752]: E1124 13:11:11.673508 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="registry-server" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.673517 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="registry-server" Nov 24 13:11:11 crc kubenswrapper[4752]: E1124 13:11:11.673567 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="extract-utilities" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.673577 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="extract-utilities" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.674049 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6b2962-03f6-4ed4-b504-429861f14548" containerName="neutron-metadata-openstack-openstack-cell1" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.674080 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcee9ef9-18fa-4da7-9fdc-1e930fbbea7c" containerName="registry-server" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.678585 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.688382 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ll4l5"] Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.698373 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.698486 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.698652 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.700597 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.700776 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812387 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktfkq\" (UniqueName: \"kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812533 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812592 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812626 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.812666 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory\") pod 
\"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.915267 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.915490 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.915848 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktfkq\" (UniqueName: \"kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.916855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.916970 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.917044 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.921479 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.922525 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.923210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.925248 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.935204 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:11 crc kubenswrapper[4752]: I1124 13:11:11.935623 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktfkq\" (UniqueName: \"kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq\") pod \"libvirt-openstack-openstack-cell1-ll4l5\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:12 crc kubenswrapper[4752]: I1124 13:11:12.024046 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:11:12 crc kubenswrapper[4752]: I1124 13:11:12.630308 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ll4l5"] Nov 24 13:11:13 crc kubenswrapper[4752]: I1124 13:11:13.582978 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" event={"ID":"e5c17484-a3bc-4bac-a15d-5a8365781b23","Type":"ContainerStarted","Data":"c4292de00bf4fec83f0fb4879a463fd7fa820de2daa152a2578b2431138b6b9d"} Nov 24 13:11:13 crc kubenswrapper[4752]: I1124 13:11:13.583437 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" event={"ID":"e5c17484-a3bc-4bac-a15d-5a8365781b23","Type":"ContainerStarted","Data":"89d859fc5e5b9155e471f4d45eda1c554c10f5c337b848160808426ef069e414"} Nov 24 13:11:13 crc kubenswrapper[4752]: I1124 13:11:13.607968 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" podStartSLOduration=2.100598786 podStartE2EDuration="2.607947664s" podCreationTimestamp="2025-11-24 13:11:11 +0000 UTC" firstStartedPulling="2025-11-24 13:11:12.650231214 +0000 UTC m=+7478.635051493" lastFinishedPulling="2025-11-24 13:11:13.157580082 +0000 UTC m=+7479.142400371" observedRunningTime="2025-11-24 13:11:13.60117791 +0000 UTC m=+7479.585998199" watchObservedRunningTime="2025-11-24 13:11:13.607947664 +0000 UTC m=+7479.592767953" Nov 24 13:11:45 crc kubenswrapper[4752]: I1124 13:11:45.469154 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:11:45 crc kubenswrapper[4752]: I1124 13:11:45.470134 4752 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:12:15 crc kubenswrapper[4752]: I1124 13:12:15.469436 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:12:15 crc kubenswrapper[4752]: I1124 13:12:15.473217 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.469196 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.471184 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.471344 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.472401 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.472567 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686" gracePeriod=600 Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.680937 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686" exitCode=0 Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.680988 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686"} Nov 24 13:12:45 crc kubenswrapper[4752]: I1124 13:12:45.681024 4752 scope.go:117] "RemoveContainer" 
containerID="ec8b3c7b99e04ac7de2a7811e3000500deed3097f15545f1d3f164ab34d7ec12" Nov 24 13:12:46 crc kubenswrapper[4752]: I1124 13:12:46.699463 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875"} Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.257896 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.261178 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.276402 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.342628 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.342765 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.342838 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxw4q\" (UniqueName: \"kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.444971 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.445332 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw4q\" (UniqueName: \"kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.445456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.445490 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.445775 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.469197 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw4q\" (UniqueName: \"kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q\") pod \"certified-operators-67mxn\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:56 crc kubenswrapper[4752]: I1124 13:13:56.593788 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:13:57 crc kubenswrapper[4752]: I1124 13:13:57.192737 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:13:57 crc kubenswrapper[4752]: I1124 13:13:57.427087 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerStarted","Data":"93bb9a3b1a4d28d40a028bc892f2cdf3ed7f2f3e46c5869e162baee2bd2c1152"} Nov 24 13:13:58 crc kubenswrapper[4752]: I1124 13:13:58.439452 4752 generic.go:334] "Generic (PLEG): container finished" podID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerID="e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72" exitCode=0 Nov 24 13:13:58 crc kubenswrapper[4752]: I1124 13:13:58.439545 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerDied","Data":"e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72"} Nov 24 13:13:59 crc kubenswrapper[4752]: I1124 13:13:59.456319 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerStarted","Data":"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e"} Nov 24 13:14:01 crc kubenswrapper[4752]: I1124 13:14:01.484008 4752 generic.go:334] "Generic (PLEG): container finished" podID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerID="5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e" exitCode=0 Nov 24 13:14:01 crc kubenswrapper[4752]: I1124 13:14:01.484078 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerDied","Data":"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e"} Nov 24 13:14:03 crc kubenswrapper[4752]: I1124 13:14:03.509911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerStarted","Data":"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82"} Nov 24 13:14:03 crc kubenswrapper[4752]: I1124 
13:14:03.550582 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-67mxn" podStartSLOduration=3.692872779 podStartE2EDuration="7.550552436s" podCreationTimestamp="2025-11-24 13:13:56 +0000 UTC" firstStartedPulling="2025-11-24 13:13:58.441620669 +0000 UTC m=+7644.426440958" lastFinishedPulling="2025-11-24 13:14:02.299300286 +0000 UTC m=+7648.284120615" observedRunningTime="2025-11-24 13:14:03.529962236 +0000 UTC m=+7649.514782535" watchObservedRunningTime="2025-11-24 13:14:03.550552436 +0000 UTC m=+7649.535372725" Nov 24 13:14:06 crc kubenswrapper[4752]: I1124 13:14:06.594447 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:06 crc kubenswrapper[4752]: I1124 13:14:06.595010 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:06 crc kubenswrapper[4752]: I1124 13:14:06.663394 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:07 crc kubenswrapper[4752]: I1124 13:14:07.612288 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:07 crc kubenswrapper[4752]: I1124 13:14:07.668857 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:14:09 crc kubenswrapper[4752]: I1124 13:14:09.571774 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-67mxn" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="registry-server" containerID="cri-o://2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82" gracePeriod=2 Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.137224 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.201346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities\") pod \"156c1de5-3bf2-4980-ba74-fcf6bef16982\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.201800 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxw4q\" (UniqueName: \"kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q\") pod \"156c1de5-3bf2-4980-ba74-fcf6bef16982\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.201835 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content\") pod \"156c1de5-3bf2-4980-ba74-fcf6bef16982\" (UID: \"156c1de5-3bf2-4980-ba74-fcf6bef16982\") " Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.202361 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities" (OuterVolumeSpecName: "utilities") pod "156c1de5-3bf2-4980-ba74-fcf6bef16982" (UID: "156c1de5-3bf2-4980-ba74-fcf6bef16982"). InnerVolumeSpecName "utilities". 
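The pod_startup_latency_tracker entry above relates its numbers as: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (about 7.55s), and podStartSLOduration is that end-to-end figure with the image-pull window (lastFinishedPulling minus firstStartedPulling, about 3.86s) excluded, giving about 3.69s. A quick check of that arithmetic in Go, with the timestamps copied from the entry; the result matches the logged values to within rounding.

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout of Go's default Time.String(), as printed in the log.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-11-24 13:13:56 +0000 UTC")
	firstPull := parse("2025-11-24 13:13:58.441620669 +0000 UTC")
	lastPull := parse("2025-11-24 13:14:02.299300286 +0000 UTC")
	running := parse("2025-11-24 13:14:03.550552436 +0000 UTC")

	e2e := running.Sub(created)          // ≈ 7.550552436s (podStartE2EDuration)
	slo := e2e - lastPull.Sub(firstPull) // pull time excluded, ≈ 3.69s (podStartSLOduration)
	fmt.Println(e2e, slo)
}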
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.202968 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.208505 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q" (OuterVolumeSpecName: "kube-api-access-rxw4q") pod "156c1de5-3bf2-4980-ba74-fcf6bef16982" (UID: "156c1de5-3bf2-4980-ba74-fcf6bef16982"). InnerVolumeSpecName "kube-api-access-rxw4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.259384 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "156c1de5-3bf2-4980-ba74-fcf6bef16982" (UID: "156c1de5-3bf2-4980-ba74-fcf6bef16982"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.304946 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxw4q\" (UniqueName: \"kubernetes.io/projected/156c1de5-3bf2-4980-ba74-fcf6bef16982-kube-api-access-rxw4q\") on node \"crc\" DevicePath \"\"" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.304980 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/156c1de5-3bf2-4980-ba74-fcf6bef16982-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.587081 4752 generic.go:334] "Generic (PLEG): container finished" podID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerID="2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82" exitCode=0 Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.587128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerDied","Data":"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82"} Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.587154 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67mxn" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.587163 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67mxn" event={"ID":"156c1de5-3bf2-4980-ba74-fcf6bef16982","Type":"ContainerDied","Data":"93bb9a3b1a4d28d40a028bc892f2cdf3ed7f2f3e46c5869e162baee2bd2c1152"} Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.587182 4752 scope.go:117] "RemoveContainer" containerID="2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.619204 4752 scope.go:117] "RemoveContainer" containerID="5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.627239 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.636587 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-67mxn"] Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.661401 4752 scope.go:117] "RemoveContainer" containerID="e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.710922 4752 scope.go:117] "RemoveContainer" containerID="2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82" Nov 24 13:14:10 crc kubenswrapper[4752]: E1124 13:14:10.711438 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82\": container with ID starting with 2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82 not found: ID does not exist" containerID="2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.711475 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82"} err="failed to get container status \"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82\": rpc error: code = NotFound desc = could not find container \"2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82\": container with ID starting with 2c96b59115605e0105628832f52b40f47091793fa513810ce4db28985a1e7b82 not found: ID does not exist" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.711497 4752 scope.go:117] "RemoveContainer" containerID="5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e" Nov 24 13:14:10 crc kubenswrapper[4752]: E1124 13:14:10.711956 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e\": container with ID starting with 5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e not found: ID does not exist" containerID="5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.711978 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e"} err="failed to get container status \"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e\": rpc error: code = NotFound desc = could not find 
container \"5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e\": container with ID starting with 5b11cf06be71aa2353a81c452e5efa556b2debf16aa1ae5eda8427057e86610e not found: ID does not exist" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.711991 4752 scope.go:117] "RemoveContainer" containerID="e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72" Nov 24 13:14:10 crc kubenswrapper[4752]: E1124 13:14:10.712391 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72\": container with ID starting with e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72 not found: ID does not exist" containerID="e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.712411 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72"} err="failed to get container status \"e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72\": rpc error: code = NotFound desc = could not find container \"e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72\": container with ID starting with e8a3808954bcdefd883485ca5489ff1291900f05086f601e657a8c03473e2a72 not found: ID does not exist" Nov 24 13:14:10 crc kubenswrapper[4752]: I1124 13:14:10.743241 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" path="/var/lib/kubelet/pods/156c1de5-3bf2-4980-ba74-fcf6bef16982/volumes" Nov 24 13:14:45 crc kubenswrapper[4752]: I1124 13:14:45.469090 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:14:45 crc kubenswrapper[4752]: I1124 13:14:45.469707 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.151521 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4"] Nov 24 13:15:00 crc kubenswrapper[4752]: E1124 13:15:00.152632 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="registry-server" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.152654 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="registry-server" Nov 24 13:15:00 crc kubenswrapper[4752]: E1124 13:15:00.152683 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="extract-content" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.152691 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="extract-content" Nov 24 13:15:00 crc kubenswrapper[4752]: E1124 13:15:00.152716 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="extract-utilities" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.152724 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="extract-utilities" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.152998 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="156c1de5-3bf2-4980-ba74-fcf6bef16982" containerName="registry-server" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.153794 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.156190 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.156198 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.161846 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4"] Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.326697 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.326785 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.326837 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.428828 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.429122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.429175 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.429864 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.435918 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.451794 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt\") pod \"collect-profiles-29399835-7hbj4\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.475273 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:00 crc kubenswrapper[4752]: I1124 13:15:00.958131 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4"] Nov 24 13:15:01 crc kubenswrapper[4752]: I1124 13:15:01.136472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" event={"ID":"85341510-dc26-4d83-9952-43cf5812e8e3","Type":"ContainerStarted","Data":"91c4715214055b44a7fefc8c7682b4c57e488234854cebe8fb266dbb61f09fde"} Nov 24 13:15:01 crc kubenswrapper[4752]: E1124 13:15:01.701529 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85341510_dc26_4d83_9952_43cf5812e8e3.slice/crio-conmon-eb78a48f195b20838a8f917fbd36dbe17649a980797a3edb16532e5f11a203cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85341510_dc26_4d83_9952_43cf5812e8e3.slice/crio-eb78a48f195b20838a8f917fbd36dbe17649a980797a3edb16532e5f11a203cb.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:15:02 crc kubenswrapper[4752]: I1124 13:15:02.147468 4752 generic.go:334] "Generic (PLEG): container finished" podID="85341510-dc26-4d83-9952-43cf5812e8e3" containerID="eb78a48f195b20838a8f917fbd36dbe17649a980797a3edb16532e5f11a203cb" exitCode=0 Nov 24 13:15:02 crc kubenswrapper[4752]: I1124 13:15:02.147519 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" 
event={"ID":"85341510-dc26-4d83-9952-43cf5812e8e3","Type":"ContainerDied","Data":"eb78a48f195b20838a8f917fbd36dbe17649a980797a3edb16532e5f11a203cb"} Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.562856 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.717866 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume\") pod \"85341510-dc26-4d83-9952-43cf5812e8e3\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.718333 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt\") pod \"85341510-dc26-4d83-9952-43cf5812e8e3\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.718383 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume\") pod \"85341510-dc26-4d83-9952-43cf5812e8e3\" (UID: \"85341510-dc26-4d83-9952-43cf5812e8e3\") " Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.719342 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume" (OuterVolumeSpecName: "config-volume") pod "85341510-dc26-4d83-9952-43cf5812e8e3" (UID: "85341510-dc26-4d83-9952-43cf5812e8e3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.728999 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85341510-dc26-4d83-9952-43cf5812e8e3" (UID: "85341510-dc26-4d83-9952-43cf5812e8e3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.729901 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt" (OuterVolumeSpecName: "kube-api-access-rvdlt") pod "85341510-dc26-4d83-9952-43cf5812e8e3" (UID: "85341510-dc26-4d83-9952-43cf5812e8e3"). InnerVolumeSpecName "kube-api-access-rvdlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.821249 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/85341510-dc26-4d83-9952-43cf5812e8e3-kube-api-access-rvdlt\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.821286 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85341510-dc26-4d83-9952-43cf5812e8e3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:03 crc kubenswrapper[4752]: I1124 13:15:03.821295 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85341510-dc26-4d83-9952-43cf5812e8e3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.165111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" event={"ID":"85341510-dc26-4d83-9952-43cf5812e8e3","Type":"ContainerDied","Data":"91c4715214055b44a7fefc8c7682b4c57e488234854cebe8fb266dbb61f09fde"} Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.165147 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c4715214055b44a7fefc8c7682b4c57e488234854cebe8fb266dbb61f09fde" Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.165163 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399835-7hbj4" Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.634230 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw"] Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.643400 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399790-blmbw"] Nov 24 13:15:04 crc kubenswrapper[4752]: I1124 13:15:04.742692 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32" path="/var/lib/kubelet/pods/351f3e1f-4d6b-4eaf-a9e1-32063b5d2c32/volumes" Nov 24 13:15:15 crc kubenswrapper[4752]: I1124 13:15:15.469576 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:15:15 crc kubenswrapper[4752]: I1124 13:15:15.470314 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.222840 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:19 crc kubenswrapper[4752]: E1124 13:15:19.226939 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85341510-dc26-4d83-9952-43cf5812e8e3" containerName="collect-profiles" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.227031 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="85341510-dc26-4d83-9952-43cf5812e8e3" containerName="collect-profiles" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.227329 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="85341510-dc26-4d83-9952-43cf5812e8e3" containerName="collect-profiles" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.228889 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.253380 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.281477 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.281543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.281575 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h2vz\" (UniqueName: \"kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.384589 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.384675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.384708 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h2vz\" (UniqueName: \"kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.385663 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.386784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.409976 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h2vz\" (UniqueName: \"kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz\") pod \"redhat-operators-68xbs\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:19 crc kubenswrapper[4752]: I1124 13:15:19.560349 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:20 crc kubenswrapper[4752]: I1124 13:15:20.059528 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:20 crc kubenswrapper[4752]: W1124 13:15:20.070393 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8e70c6_49f8_4c07_8c8e_166f03799666.slice/crio-0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd WatchSource:0}: Error finding container 0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd: Status 404 returned error can't find the container with id 0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd Nov 24 13:15:20 crc kubenswrapper[4752]: I1124 13:15:20.348378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerStarted","Data":"0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd"} Nov 24 13:15:21 crc kubenswrapper[4752]: I1124 13:15:21.359408 4752 generic.go:334] "Generic (PLEG): container finished" podID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerID="7768dcb98f630b8c69a5b80a0e8825bda764f7a71aa9793063fe849834a93fd5" exitCode=0 Nov 24 13:15:21 crc kubenswrapper[4752]: I1124 13:15:21.359501 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerDied","Data":"7768dcb98f630b8c69a5b80a0e8825bda764f7a71aa9793063fe849834a93fd5"} Nov 24 13:15:21 crc kubenswrapper[4752]: I1124 13:15:21.363080 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.827549 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.830355 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.850936 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.972358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmc6z\" (UniqueName: \"kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.972431 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:22 crc kubenswrapper[4752]: I1124 13:15:22.972864 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.074803 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmc6z\" (UniqueName: \"kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.074902 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.075000 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.075416 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.075527 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.094529 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tmc6z\" (UniqueName: \"kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z\") pod \"redhat-marketplace-xsk8h\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.151784 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.395540 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerStarted","Data":"0eb5654606027038b07a8bb545f31b6e9ddcb5aaf1118e3cda4bbf561eed1923"} Nov 24 13:15:23 crc kubenswrapper[4752]: I1124 13:15:23.689859 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:23 crc kubenswrapper[4752]: W1124 13:15:23.692893 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4610eb9a_17af_409c_a457_5d1836326c20.slice/crio-104799041e4079384f9fe10f045b936668e6fd57e214f160a1e85e3d09625add WatchSource:0}: Error finding container 104799041e4079384f9fe10f045b936668e6fd57e214f160a1e85e3d09625add: Status 404 returned error can't find the container with id 104799041e4079384f9fe10f045b936668e6fd57e214f160a1e85e3d09625add Nov 24 13:15:24 crc kubenswrapper[4752]: I1124 13:15:24.405384 4752 generic.go:334] "Generic (PLEG): container finished" podID="4610eb9a-17af-409c-a457-5d1836326c20" containerID="3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d" exitCode=0 Nov 24 13:15:24 crc kubenswrapper[4752]: I1124 13:15:24.405787 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerDied","Data":"3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d"} Nov 24 13:15:24 crc kubenswrapper[4752]: I1124 13:15:24.405841 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerStarted","Data":"104799041e4079384f9fe10f045b936668e6fd57e214f160a1e85e3d09625add"} Nov 24 13:15:27 crc kubenswrapper[4752]: I1124 13:15:27.435911 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerStarted","Data":"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc"} Nov 24 13:15:29 crc kubenswrapper[4752]: I1124 13:15:29.457064 4752 generic.go:334] "Generic (PLEG): container finished" podID="4610eb9a-17af-409c-a457-5d1836326c20" containerID="9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc" exitCode=0 Nov 24 13:15:29 crc kubenswrapper[4752]: I1124 13:15:29.457301 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerDied","Data":"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc"} Nov 24 13:15:32 crc kubenswrapper[4752]: I1124 13:15:32.489871 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" 
event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerStarted","Data":"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b"} Nov 24 13:15:32 crc kubenswrapper[4752]: I1124 13:15:32.497225 4752 generic.go:334] "Generic (PLEG): container finished" podID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerID="0eb5654606027038b07a8bb545f31b6e9ddcb5aaf1118e3cda4bbf561eed1923" exitCode=0 Nov 24 13:15:32 crc kubenswrapper[4752]: I1124 13:15:32.497287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerDied","Data":"0eb5654606027038b07a8bb545f31b6e9ddcb5aaf1118e3cda4bbf561eed1923"} Nov 24 13:15:32 crc kubenswrapper[4752]: I1124 13:15:32.537888 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xsk8h" podStartSLOduration=3.620610768 podStartE2EDuration="10.537865173s" podCreationTimestamp="2025-11-24 13:15:22 +0000 UTC" firstStartedPulling="2025-11-24 13:15:24.407975423 +0000 UTC m=+7730.392795712" lastFinishedPulling="2025-11-24 13:15:31.325229828 +0000 UTC m=+7737.310050117" observedRunningTime="2025-11-24 13:15:32.513003841 +0000 UTC m=+7738.497824130" watchObservedRunningTime="2025-11-24 13:15:32.537865173 +0000 UTC m=+7738.522685452" Nov 24 13:15:33 crc kubenswrapper[4752]: I1124 13:15:33.152303 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:33 crc kubenswrapper[4752]: I1124 13:15:33.152696 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:33 crc kubenswrapper[4752]: I1124 13:15:33.509826 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerStarted","Data":"e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069"} Nov 24 13:15:33 crc kubenswrapper[4752]: I1124 13:15:33.528480 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-68xbs" podStartSLOduration=2.947534306 podStartE2EDuration="14.528464891s" podCreationTimestamp="2025-11-24 13:15:19 +0000 UTC" firstStartedPulling="2025-11-24 13:15:21.362787866 +0000 UTC m=+7727.347608155" lastFinishedPulling="2025-11-24 13:15:32.943718451 +0000 UTC m=+7738.928538740" observedRunningTime="2025-11-24 13:15:33.526918037 +0000 UTC m=+7739.511738316" watchObservedRunningTime="2025-11-24 13:15:33.528464891 +0000 UTC m=+7739.513285180" Nov 24 13:15:34 crc kubenswrapper[4752]: I1124 13:15:34.218915 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xsk8h" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="registry-server" probeResult="failure" output=< Nov 24 13:15:34 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 13:15:34 crc kubenswrapper[4752]: > Nov 24 13:15:39 crc kubenswrapper[4752]: I1124 13:15:39.561716 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:39 crc kubenswrapper[4752]: I1124 13:15:39.562368 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:39 crc kubenswrapper[4752]: I1124 13:15:39.611551 4752 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:40 crc kubenswrapper[4752]: I1124 13:15:40.630667 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:40 crc kubenswrapper[4752]: I1124 13:15:40.704700 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:42 crc kubenswrapper[4752]: I1124 13:15:42.593299 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-68xbs" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="registry-server" containerID="cri-o://e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069" gracePeriod=2 Nov 24 13:15:42 crc kubenswrapper[4752]: E1124 13:15:42.803295 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8e70c6_49f8_4c07_8c8e_166f03799666.slice/crio-e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.222956 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.277414 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.462070 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.609560 4752 generic.go:334] "Generic (PLEG): container finished" podID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerID="e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069" exitCode=0 Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.609635 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerDied","Data":"e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069"} Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.609677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68xbs" event={"ID":"db8e70c6-49f8-4c07-8c8e-166f03799666","Type":"ContainerDied","Data":"0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd"} Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.609690 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0e540c6a0b2c39d814feb07445cf437cc0727b31246da1acef889987f2dcdd" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.609597 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.776467 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities\") pod \"db8e70c6-49f8-4c07-8c8e-166f03799666\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.776544 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h2vz\" (UniqueName: \"kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz\") pod \"db8e70c6-49f8-4c07-8c8e-166f03799666\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.776774 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content\") pod \"db8e70c6-49f8-4c07-8c8e-166f03799666\" (UID: \"db8e70c6-49f8-4c07-8c8e-166f03799666\") " Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.777640 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities" (OuterVolumeSpecName: "utilities") pod "db8e70c6-49f8-4c07-8c8e-166f03799666" (UID: "db8e70c6-49f8-4c07-8c8e-166f03799666"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.777805 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.783174 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz" (OuterVolumeSpecName: "kube-api-access-6h2vz") pod "db8e70c6-49f8-4c07-8c8e-166f03799666" (UID: "db8e70c6-49f8-4c07-8c8e-166f03799666"). InnerVolumeSpecName "kube-api-access-6h2vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.884435 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h2vz\" (UniqueName: \"kubernetes.io/projected/db8e70c6-49f8-4c07-8c8e-166f03799666-kube-api-access-6h2vz\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.884798 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db8e70c6-49f8-4c07-8c8e-166f03799666" (UID: "db8e70c6-49f8-4c07-8c8e-166f03799666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:15:43 crc kubenswrapper[4752]: I1124 13:15:43.987113 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8e70c6-49f8-4c07-8c8e-166f03799666-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:44 crc kubenswrapper[4752]: I1124 13:15:44.619520 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68xbs" Nov 24 13:15:44 crc kubenswrapper[4752]: I1124 13:15:44.619601 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xsk8h" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="registry-server" containerID="cri-o://be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b" gracePeriod=2 Nov 24 13:15:44 crc kubenswrapper[4752]: I1124 13:15:44.677558 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:44 crc kubenswrapper[4752]: I1124 13:15:44.686546 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-68xbs"] Nov 24 13:15:44 crc kubenswrapper[4752]: I1124 13:15:44.746563 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" path="/var/lib/kubelet/pods/db8e70c6-49f8-4c07-8c8e-166f03799666/volumes" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.128602 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.317191 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content\") pod \"4610eb9a-17af-409c-a457-5d1836326c20\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.317393 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmc6z\" (UniqueName: \"kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z\") pod \"4610eb9a-17af-409c-a457-5d1836326c20\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.317510 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities\") pod \"4610eb9a-17af-409c-a457-5d1836326c20\" (UID: \"4610eb9a-17af-409c-a457-5d1836326c20\") " Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.318970 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities" (OuterVolumeSpecName: "utilities") pod "4610eb9a-17af-409c-a457-5d1836326c20" (UID: "4610eb9a-17af-409c-a457-5d1836326c20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.326303 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z" (OuterVolumeSpecName: "kube-api-access-tmc6z") pod "4610eb9a-17af-409c-a457-5d1836326c20" (UID: "4610eb9a-17af-409c-a457-5d1836326c20"). InnerVolumeSpecName "kube-api-access-tmc6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.347409 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4610eb9a-17af-409c-a457-5d1836326c20" (UID: "4610eb9a-17af-409c-a457-5d1836326c20"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.420385 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.420413 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmc6z\" (UniqueName: \"kubernetes.io/projected/4610eb9a-17af-409c-a457-5d1836326c20-kube-api-access-tmc6z\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.420423 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610eb9a-17af-409c-a457-5d1836326c20-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.468488 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.468591 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.468650 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.469596 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.469673 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" gracePeriod=600 Nov 24 13:15:45 crc kubenswrapper[4752]: E1124 13:15:45.599675 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.638090 4752 generic.go:334] "Generic (PLEG): container finished" podID="4610eb9a-17af-409c-a457-5d1836326c20" containerID="be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b" exitCode=0 Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.638142 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsk8h" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.638162 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerDied","Data":"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b"} Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.638680 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsk8h" event={"ID":"4610eb9a-17af-409c-a457-5d1836326c20","Type":"ContainerDied","Data":"104799041e4079384f9fe10f045b936668e6fd57e214f160a1e85e3d09625add"} Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.638713 4752 scope.go:117] "RemoveContainer" containerID="be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.647195 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" exitCode=0 Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.647231 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875"} Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.647969 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:15:45 crc kubenswrapper[4752]: E1124 13:15:45.648225 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.696143 4752 scope.go:117] "RemoveContainer" containerID="9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.707551 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.718191 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsk8h"] Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.736221 4752 scope.go:117] "RemoveContainer" containerID="3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.802694 4752 scope.go:117] "RemoveContainer" containerID="be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b" Nov 24 13:15:45 crc kubenswrapper[4752]: E1124 13:15:45.803690 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b\": container with ID starting with be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b not found: ID does not exist" containerID="be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 
13:15:45.803857 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b"} err="failed to get container status \"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b\": rpc error: code = NotFound desc = could not find container \"be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b\": container with ID starting with be6c66c05aeb917338a44dc1fe0f5d025ec60e9591c2ea7db8b5e108bceef20b not found: ID does not exist" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.803967 4752 scope.go:117] "RemoveContainer" containerID="9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc" Nov 24 13:15:45 crc kubenswrapper[4752]: E1124 13:15:45.804504 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc\": container with ID starting with 9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc not found: ID does not exist" containerID="9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.804607 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc"} err="failed to get container status \"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc\": rpc error: code = NotFound desc = could not find container \"9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc\": container with ID starting with 9af2660a8a8ea475be726bfff4a810ed3814bcc5242ca84311246ea0fce2b6fc not found: ID does not exist" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.804677 4752 scope.go:117] "RemoveContainer" containerID="3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d" Nov 24 13:15:45 crc kubenswrapper[4752]: E1124 13:15:45.805288 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d\": container with ID starting with 3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d not found: ID does not exist" containerID="3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.805445 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d"} err="failed to get container status \"3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d\": rpc error: code = NotFound desc = could not find container \"3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d\": container with ID starting with 3572b267f6a9690b820024762d0f244b75b3ca5db84fac9fb5485c6c4802e24d not found: ID does not exist" Nov 24 13:15:45 crc kubenswrapper[4752]: I1124 13:15:45.805589 4752 scope.go:117] "RemoveContainer" containerID="9c9cb22871fae3122be0525222b73b32cda01b283e33e73927f4054a906c5686" Nov 24 13:15:46 crc kubenswrapper[4752]: I1124 13:15:46.754479 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4610eb9a-17af-409c-a457-5d1836326c20" path="/var/lib/kubelet/pods/4610eb9a-17af-409c-a457-5d1836326c20/volumes" Nov 24 13:15:49 crc kubenswrapper[4752]: I1124 13:15:49.496088 4752 scope.go:117] "RemoveContainer" 
containerID="ef2fed93ad2b502a48ca45b4eed05d67a84a7e3fea151747bc48d80d57be64d0" Nov 24 13:15:52 crc kubenswrapper[4752]: I1124 13:15:52.751247 4752 generic.go:334] "Generic (PLEG): container finished" podID="e5c17484-a3bc-4bac-a15d-5a8365781b23" containerID="c4292de00bf4fec83f0fb4879a463fd7fa820de2daa152a2578b2431138b6b9d" exitCode=0 Nov 24 13:15:52 crc kubenswrapper[4752]: I1124 13:15:52.754086 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" event={"ID":"e5c17484-a3bc-4bac-a15d-5a8365781b23","Type":"ContainerDied","Data":"c4292de00bf4fec83f0fb4879a463fd7fa820de2daa152a2578b2431138b6b9d"} Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.249361 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.439693 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.440926 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.441091 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.441553 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.441671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.441823 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktfkq\" (UniqueName: \"kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq\") pod \"e5c17484-a3bc-4bac-a15d-5a8365781b23\" (UID: \"e5c17484-a3bc-4bac-a15d-5a8365781b23\") " Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.448559 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq" (OuterVolumeSpecName: "kube-api-access-ktfkq") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "kube-api-access-ktfkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.453516 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.453575 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph" (OuterVolumeSpecName: "ceph") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.483131 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory" (OuterVolumeSpecName: "inventory") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.489576 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.508254 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e5c17484-a3bc-4bac-a15d-5a8365781b23" (UID: "e5c17484-a3bc-4bac-a15d-5a8365781b23"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546875 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546910 4752 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546923 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktfkq\" (UniqueName: \"kubernetes.io/projected/e5c17484-a3bc-4bac-a15d-5a8365781b23-kube-api-access-ktfkq\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546938 4752 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546949 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.546958 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5c17484-a3bc-4bac-a15d-5a8365781b23-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.777437 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" event={"ID":"e5c17484-a3bc-4bac-a15d-5a8365781b23","Type":"ContainerDied","Data":"89d859fc5e5b9155e471f4d45eda1c554c10f5c337b848160808426ef069e414"} Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.777478 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d859fc5e5b9155e471f4d45eda1c554c10f5c337b848160808426ef069e414" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.777540 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ll4l5" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879106 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbv2q"] Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879550 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="extract-utilities" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879572 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="extract-utilities" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879584 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="extract-content" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879590 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="extract-content" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879605 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879612 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879623 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879629 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879658 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c17484-a3bc-4bac-a15d-5a8365781b23" containerName="libvirt-openstack-openstack-cell1" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879664 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c17484-a3bc-4bac-a15d-5a8365781b23" containerName="libvirt-openstack-openstack-cell1" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879680 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="extract-content" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879687 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="extract-content" Nov 24 13:15:54 crc kubenswrapper[4752]: E1124 13:15:54.879710 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="extract-utilities" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879722 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="extract-utilities" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879943 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4610eb9a-17af-409c-a457-5d1836326c20" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.879969 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c17484-a3bc-4bac-a15d-5a8365781b23" containerName="libvirt-openstack-openstack-cell1" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 
13:15:54.879976 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8e70c6-49f8-4c07-8c8e-166f03799666" containerName="registry-server" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.880621 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.883876 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.883923 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.883952 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.883968 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.884203 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.884634 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.886479 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.915784 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbv2q"] Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958437 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958509 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958540 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc 
kubenswrapper[4752]: I1124 13:15:54.958568 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958699 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958728 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958836 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:54 crc kubenswrapper[4752]: I1124 13:15:54.958898 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8f8\" (UniqueName: \"kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061081 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 
13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061140 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061259 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061383 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.061418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062436 4752 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062501 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8f8\" (UniqueName: \"kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062540 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.062591 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.065344 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.065840 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.066078 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.066285 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.066454 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: 
\"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.066582 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.068210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.068535 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.081181 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8f8\" (UniqueName: \"kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8\") pod \"nova-cell1-openstack-openstack-cell1-rbv2q\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.212665 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.741090 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbv2q"] Nov 24 13:15:55 crc kubenswrapper[4752]: I1124 13:15:55.792473 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" event={"ID":"d9673c7e-0da7-40fd-880a-f53c18050035","Type":"ContainerStarted","Data":"685f9d697563aab3c47f9ba3a9b14bf64065ce0e27740fc716070248e1201ab4"} Nov 24 13:15:56 crc kubenswrapper[4752]: I1124 13:15:56.811022 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" event={"ID":"d9673c7e-0da7-40fd-880a-f53c18050035","Type":"ContainerStarted","Data":"7e3e17aa87cf78fba157ca4807ea71705187a62060155f719b9c8e3dad7097c3"} Nov 24 13:15:56 crc kubenswrapper[4752]: I1124 13:15:56.838481 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" podStartSLOduration=2.236281715 podStartE2EDuration="2.838456354s" podCreationTimestamp="2025-11-24 13:15:54 +0000 UTC" firstStartedPulling="2025-11-24 13:15:55.745796564 +0000 UTC m=+7761.730616853" lastFinishedPulling="2025-11-24 13:15:56.347971203 +0000 UTC m=+7762.332791492" observedRunningTime="2025-11-24 13:15:56.830251359 +0000 UTC m=+7762.815071658" watchObservedRunningTime="2025-11-24 13:15:56.838456354 +0000 UTC m=+7762.823276653" Nov 24 13:15:59 crc kubenswrapper[4752]: I1124 13:15:59.728439 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:15:59 crc kubenswrapper[4752]: E1124 13:15:59.729174 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:16:13 crc kubenswrapper[4752]: I1124 13:16:13.728533 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:16:13 crc kubenswrapper[4752]: E1124 13:16:13.729483 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:16:25 crc kubenswrapper[4752]: I1124 13:16:25.728245 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:16:25 crc kubenswrapper[4752]: E1124 13:16:25.729076 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:16:37 crc kubenswrapper[4752]: I1124 13:16:37.728906 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:16:37 crc kubenswrapper[4752]: E1124 13:16:37.729805 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:16:51 crc kubenswrapper[4752]: I1124 13:16:51.729519 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:16:51 crc kubenswrapper[4752]: E1124 13:16:51.730374 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:17:06 crc kubenswrapper[4752]: I1124 13:17:06.732818 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:17:06 crc kubenswrapper[4752]: E1124 13:17:06.733850 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:17:19 crc kubenswrapper[4752]: I1124 13:17:19.727819 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:17:19 crc kubenswrapper[4752]: E1124 13:17:19.728648 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:17:30 crc kubenswrapper[4752]: I1124 13:17:30.729782 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:17:30 crc kubenswrapper[4752]: E1124 13:17:30.731250 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:17:42 crc kubenswrapper[4752]: I1124 13:17:42.729429 4752 scope.go:117] "RemoveContainer" 
containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:17:42 crc kubenswrapper[4752]: E1124 13:17:42.730558 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:17:53 crc kubenswrapper[4752]: I1124 13:17:53.729759 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:17:53 crc kubenswrapper[4752]: E1124 13:17:53.730438 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:18:08 crc kubenswrapper[4752]: I1124 13:18:08.728363 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:18:08 crc kubenswrapper[4752]: E1124 13:18:08.729018 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:18:21 crc kubenswrapper[4752]: I1124 13:18:21.728034 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:18:21 crc kubenswrapper[4752]: E1124 13:18:21.729048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:18:33 crc kubenswrapper[4752]: I1124 13:18:33.728132 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:18:33 crc kubenswrapper[4752]: E1124 13:18:33.728868 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:18:47 crc kubenswrapper[4752]: I1124 13:18:47.729038 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:18:47 crc kubenswrapper[4752]: E1124 13:18:47.730156 4752 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:18:58 crc kubenswrapper[4752]: I1124 13:18:58.728409 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:18:58 crc kubenswrapper[4752]: E1124 13:18:58.729439 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:19:07 crc kubenswrapper[4752]: I1124 13:19:07.048509 4752 generic.go:334] "Generic (PLEG): container finished" podID="d9673c7e-0da7-40fd-880a-f53c18050035" containerID="7e3e17aa87cf78fba157ca4807ea71705187a62060155f719b9c8e3dad7097c3" exitCode=0 Nov 24 13:19:07 crc kubenswrapper[4752]: I1124 13:19:07.048618 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" event={"ID":"d9673c7e-0da7-40fd-880a-f53c18050035","Type":"ContainerDied","Data":"7e3e17aa87cf78fba157ca4807ea71705187a62060155f719b9c8e3dad7097c3"} Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.703344 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.850620 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.850958 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851022 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851072 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851114 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0\") pod 
\"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851150 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851232 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851277 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851359 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv8f8\" (UniqueName: \"kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.851432 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph\") pod \"d9673c7e-0da7-40fd-880a-f53c18050035\" (UID: \"d9673c7e-0da7-40fd-880a-f53c18050035\") " Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.855644 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph" (OuterVolumeSpecName: "ceph") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.856304 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8" (OuterVolumeSpecName: "kube-api-access-mv8f8") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "kube-api-access-mv8f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.864791 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). 
InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.881355 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.884697 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.884759 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.887235 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.898360 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.899079 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory" (OuterVolumeSpecName: "inventory") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.899498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.903916 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d9673c7e-0da7-40fd-880a-f53c18050035" (UID: "d9673c7e-0da7-40fd-880a-f53c18050035"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958854 4752 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958919 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958935 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958950 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958962 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.958999 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.959014 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d9673c7e-0da7-40fd-880a-f53c18050035-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.959027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv8f8\" (UniqueName: \"kubernetes.io/projected/d9673c7e-0da7-40fd-880a-f53c18050035-kube-api-access-mv8f8\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.959038 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.959071 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:08 crc kubenswrapper[4752]: I1124 13:19:08.959142 4752 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/d9673c7e-0da7-40fd-880a-f53c18050035-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.069614 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" event={"ID":"d9673c7e-0da7-40fd-880a-f53c18050035","Type":"ContainerDied","Data":"685f9d697563aab3c47f9ba3a9b14bf64065ce0e27740fc716070248e1201ab4"} Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.069662 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685f9d697563aab3c47f9ba3a9b14bf64065ce0e27740fc716070248e1201ab4" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.070241 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbv2q" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.187911 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-g9gr2"] Nov 24 13:19:09 crc kubenswrapper[4752]: E1124 13:19:09.188436 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9673c7e-0da7-40fd-880a-f53c18050035" containerName="nova-cell1-openstack-openstack-cell1" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.188465 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9673c7e-0da7-40fd-880a-f53c18050035" containerName="nova-cell1-openstack-openstack-cell1" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.188798 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9673c7e-0da7-40fd-880a-f53c18050035" containerName="nova-cell1-openstack-openstack-cell1" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.189769 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.193192 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.193421 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.193777 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.193963 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.194100 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.201552 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-g9gr2"] Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.264989 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265145 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265214 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265256 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265462 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jk5t\" 
(UniqueName: \"kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265578 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.265711 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367580 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367760 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jk5t\" (UniqueName: \"kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t\") pod 
\"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367805 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.367853 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.374299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.374966 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.375674 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.376418 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.377233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.379500 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.383157 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.386318 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jk5t\" (UniqueName: \"kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t\") pod \"telemetry-openstack-openstack-cell1-g9gr2\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:09 crc kubenswrapper[4752]: I1124 13:19:09.505944 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:19:10 crc kubenswrapper[4752]: I1124 13:19:10.046269 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-g9gr2"] Nov 24 13:19:10 crc kubenswrapper[4752]: I1124 13:19:10.082502 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" event={"ID":"23f896f6-ecd7-426c-98a6-66ce6cec1202","Type":"ContainerStarted","Data":"af63e96132afbecbaf4e8ef0f0c02fe059b81ad421ecacac884efd529f31248e"} Nov 24 13:19:11 crc kubenswrapper[4752]: I1124 13:19:11.093534 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" event={"ID":"23f896f6-ecd7-426c-98a6-66ce6cec1202","Type":"ContainerStarted","Data":"28927d28fe89984bc4f9f6512c0def569f086674fcc79b2cd4bbc11c00d682f3"} Nov 24 13:19:11 crc kubenswrapper[4752]: I1124 13:19:11.113106 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" podStartSLOduration=1.605692758 podStartE2EDuration="2.113085404s" podCreationTimestamp="2025-11-24 13:19:09 +0000 UTC" firstStartedPulling="2025-11-24 13:19:10.063668041 +0000 UTC m=+7956.048488330" lastFinishedPulling="2025-11-24 13:19:10.571060677 +0000 UTC m=+7956.555880976" observedRunningTime="2025-11-24 13:19:11.109641845 +0000 UTC m=+7957.094462154" watchObservedRunningTime="2025-11-24 13:19:11.113085404 +0000 UTC m=+7957.097905703" Nov 24 13:19:13 crc kubenswrapper[4752]: I1124 13:19:13.727759 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:19:13 crc kubenswrapper[4752]: E1124 13:19:13.728383 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:19:26 crc kubenswrapper[4752]: I1124 13:19:26.727851 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:19:26 crc kubenswrapper[4752]: E1124 13:19:26.728972 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:19:38 crc kubenswrapper[4752]: I1124 13:19:38.728192 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:19:38 crc kubenswrapper[4752]: E1124 13:19:38.730438 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:19:53 crc kubenswrapper[4752]: I1124 13:19:53.728682 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:19:53 crc kubenswrapper[4752]: E1124 13:19:53.729627 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:20:08 crc kubenswrapper[4752]: I1124 13:20:08.728368 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:20:08 crc kubenswrapper[4752]: E1124 13:20:08.729679 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:20:23 crc kubenswrapper[4752]: I1124 13:20:23.729149 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:20:23 crc kubenswrapper[4752]: E1124 13:20:23.730065 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:20:38 crc kubenswrapper[4752]: I1124 13:20:38.729442 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:20:38 crc kubenswrapper[4752]: E1124 13:20:38.730359 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:20:53 crc kubenswrapper[4752]: I1124 13:20:53.728608 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:20:54 crc kubenswrapper[4752]: I1124 13:20:54.277933 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f"} Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.230039 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.233697 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.246299 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.252564 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.252633 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.252730 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9h9g\" (UniqueName: \"kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.355851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.355932 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.356013 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9h9g\" (UniqueName: \"kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " 
pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.357090 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.357156 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.380524 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9h9g\" (UniqueName: \"kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g\") pod \"community-operators-j5z2r\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:14 crc kubenswrapper[4752]: I1124 13:21:14.554467 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:15 crc kubenswrapper[4752]: I1124 13:21:15.099043 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:15 crc kubenswrapper[4752]: I1124 13:21:15.517797 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerID="c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5" exitCode=0 Nov 24 13:21:15 crc kubenswrapper[4752]: I1124 13:21:15.517867 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerDied","Data":"c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5"} Nov 24 13:21:15 crc kubenswrapper[4752]: I1124 13:21:15.518170 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerStarted","Data":"5d430c347783177a54d1cd7f9f904d0c397a41277e17be68bd3a140f493abef4"} Nov 24 13:21:15 crc kubenswrapper[4752]: I1124 13:21:15.519838 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:21:16 crc kubenswrapper[4752]: I1124 13:21:16.530198 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerStarted","Data":"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96"} Nov 24 13:21:17 crc kubenswrapper[4752]: I1124 13:21:17.543298 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerID="59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96" exitCode=0 Nov 24 13:21:17 crc kubenswrapper[4752]: I1124 13:21:17.543355 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerDied","Data":"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96"} 
Nov 24 13:21:18 crc kubenswrapper[4752]: I1124 13:21:18.572508 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerStarted","Data":"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0"} Nov 24 13:21:18 crc kubenswrapper[4752]: I1124 13:21:18.602114 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j5z2r" podStartSLOduration=2.181047422 podStartE2EDuration="4.602085771s" podCreationTimestamp="2025-11-24 13:21:14 +0000 UTC" firstStartedPulling="2025-11-24 13:21:15.519566775 +0000 UTC m=+8081.504387064" lastFinishedPulling="2025-11-24 13:21:17.940605114 +0000 UTC m=+8083.925425413" observedRunningTime="2025-11-24 13:21:18.596579743 +0000 UTC m=+8084.581400052" watchObservedRunningTime="2025-11-24 13:21:18.602085771 +0000 UTC m=+8084.586906060" Nov 24 13:21:24 crc kubenswrapper[4752]: I1124 13:21:24.555420 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:24 crc kubenswrapper[4752]: I1124 13:21:24.556048 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:24 crc kubenswrapper[4752]: I1124 13:21:24.623432 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:24 crc kubenswrapper[4752]: I1124 13:21:24.747651 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:25 crc kubenswrapper[4752]: I1124 13:21:25.821805 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:26 crc kubenswrapper[4752]: I1124 13:21:26.681452 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j5z2r" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="registry-server" containerID="cri-o://9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0" gracePeriod=2 Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.164229 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.285034 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9h9g\" (UniqueName: \"kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g\") pod \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.285194 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities\") pod \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.285395 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content\") pod \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\" (UID: \"8a841180-ec33-4eb3-8fe0-3d861eff9c1e\") " Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.286063 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities" (OuterVolumeSpecName: "utilities") pod "8a841180-ec33-4eb3-8fe0-3d861eff9c1e" (UID: "8a841180-ec33-4eb3-8fe0-3d861eff9c1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.299577 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g" (OuterVolumeSpecName: "kube-api-access-m9h9g") pod "8a841180-ec33-4eb3-8fe0-3d861eff9c1e" (UID: "8a841180-ec33-4eb3-8fe0-3d861eff9c1e"). InnerVolumeSpecName "kube-api-access-m9h9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.387579 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9h9g\" (UniqueName: \"kubernetes.io/projected/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-kube-api-access-m9h9g\") on node \"crc\" DevicePath \"\"" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.387615 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.568932 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a841180-ec33-4eb3-8fe0-3d861eff9c1e" (UID: "8a841180-ec33-4eb3-8fe0-3d861eff9c1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.590946 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a841180-ec33-4eb3-8fe0-3d861eff9c1e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.695241 4752 generic.go:334] "Generic (PLEG): container finished" podID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerID="9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0" exitCode=0 Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.695293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerDied","Data":"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0"} Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.695332 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j5z2r" event={"ID":"8a841180-ec33-4eb3-8fe0-3d861eff9c1e","Type":"ContainerDied","Data":"5d430c347783177a54d1cd7f9f904d0c397a41277e17be68bd3a140f493abef4"} Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.695371 4752 scope.go:117] "RemoveContainer" containerID="9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.695301 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j5z2r" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.731950 4752 scope.go:117] "RemoveContainer" containerID="59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.739094 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.750129 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j5z2r"] Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.761033 4752 scope.go:117] "RemoveContainer" containerID="c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.828695 4752 scope.go:117] "RemoveContainer" containerID="9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0" Nov 24 13:21:27 crc kubenswrapper[4752]: E1124 13:21:27.829184 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0\": container with ID starting with 9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0 not found: ID does not exist" containerID="9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.829220 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0"} err="failed to get container status \"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0\": rpc error: code = NotFound desc = could not find container \"9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0\": container with ID starting with 9065c367a31166c257fa4ee1dd7430e6ca7bcec9b0a49ea5b88db096a5b64ac0 not found: ID does not exist" Nov 24 
13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.829246 4752 scope.go:117] "RemoveContainer" containerID="59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96" Nov 24 13:21:27 crc kubenswrapper[4752]: E1124 13:21:27.829541 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96\": container with ID starting with 59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96 not found: ID does not exist" containerID="59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.829583 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96"} err="failed to get container status \"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96\": rpc error: code = NotFound desc = could not find container \"59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96\": container with ID starting with 59960227f7bff28cb952dff51fb55f8f617aff04cfcdf3f78c24d8c845947c96 not found: ID does not exist" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.829612 4752 scope.go:117] "RemoveContainer" containerID="c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5" Nov 24 13:21:27 crc kubenswrapper[4752]: E1124 13:21:27.829896 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5\": container with ID starting with c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5 not found: ID does not exist" containerID="c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5" Nov 24 13:21:27 crc kubenswrapper[4752]: I1124 13:21:27.829926 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5"} err="failed to get container status \"c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5\": rpc error: code = NotFound desc = could not find container \"c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5\": container with ID starting with c64c73ec868bda842a2a1cd55b3f59dc76478a47d2c7e0b45510b75834ea1fe5 not found: ID does not exist" Nov 24 13:21:28 crc kubenswrapper[4752]: I1124 13:21:28.743516 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" path="/var/lib/kubelet/pods/8a841180-ec33-4eb3-8fe0-3d861eff9c1e/volumes" Nov 24 13:21:49 crc kubenswrapper[4752]: I1124 13:21:49.682206 4752 scope.go:117] "RemoveContainer" containerID="7768dcb98f630b8c69a5b80a0e8825bda764f7a71aa9793063fe849834a93fd5" Nov 24 13:21:49 crc kubenswrapper[4752]: I1124 13:21:49.710700 4752 scope.go:117] "RemoveContainer" containerID="0eb5654606027038b07a8bb545f31b6e9ddcb5aaf1118e3cda4bbf561eed1923" Nov 24 13:21:49 crc kubenswrapper[4752]: I1124 13:21:49.784776 4752 scope.go:117] "RemoveContainer" containerID="e7ebb04b4e7b4c734e59ba7c3e916c5707e0816a317fe04e4a4c454ce9cc3069" Nov 24 13:23:15 crc kubenswrapper[4752]: I1124 13:23:15.051438 4752 generic.go:334] "Generic (PLEG): container finished" podID="23f896f6-ecd7-426c-98a6-66ce6cec1202" containerID="28927d28fe89984bc4f9f6512c0def569f086674fcc79b2cd4bbc11c00d682f3" exitCode=0 Nov 24 13:23:15 crc kubenswrapper[4752]: 
I1124 13:23:15.051536 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" event={"ID":"23f896f6-ecd7-426c-98a6-66ce6cec1202","Type":"ContainerDied","Data":"28927d28fe89984bc4f9f6512c0def569f086674fcc79b2cd4bbc11c00d682f3"} Nov 24 13:23:15 crc kubenswrapper[4752]: I1124 13:23:15.468338 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:23:15 crc kubenswrapper[4752]: I1124 13:23:15.469056 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.583228 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.741065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.741678 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.741932 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.742025 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.742185 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jk5t\" (UniqueName: \"kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.742286 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.742391 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.742497 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1\") pod \"23f896f6-ecd7-426c-98a6-66ce6cec1202\" (UID: \"23f896f6-ecd7-426c-98a6-66ce6cec1202\") " Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.756662 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t" (OuterVolumeSpecName: "kube-api-access-4jk5t") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "kube-api-access-4jk5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.757115 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph" (OuterVolumeSpecName: "ceph") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.760473 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.778898 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.783128 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.786083 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.792857 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.823701 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory" (OuterVolumeSpecName: "inventory") pod "23f896f6-ecd7-426c-98a6-66ce6cec1202" (UID: "23f896f6-ecd7-426c-98a6-66ce6cec1202"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847089 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847123 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847139 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jk5t\" (UniqueName: \"kubernetes.io/projected/23f896f6-ecd7-426c-98a6-66ce6cec1202-kube-api-access-4jk5t\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847151 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847164 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847178 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847190 4752 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:16 crc kubenswrapper[4752]: I1124 13:23:16.847202 4752 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f896f6-ecd7-426c-98a6-66ce6cec1202-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.076663 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" event={"ID":"23f896f6-ecd7-426c-98a6-66ce6cec1202","Type":"ContainerDied","Data":"af63e96132afbecbaf4e8ef0f0c02fe059b81ad421ecacac884efd529f31248e"} Nov 24 13:23:17 crc 
kubenswrapper[4752]: I1124 13:23:17.076731 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af63e96132afbecbaf4e8ef0f0c02fe059b81ad421ecacac884efd529f31248e" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.076769 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-g9gr2" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.182195 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d49m"] Nov 24 13:23:17 crc kubenswrapper[4752]: E1124 13:23:17.182816 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="registry-server" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.182839 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="registry-server" Nov 24 13:23:17 crc kubenswrapper[4752]: E1124 13:23:17.182865 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="extract-content" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.182875 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="extract-content" Nov 24 13:23:17 crc kubenswrapper[4752]: E1124 13:23:17.182895 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f896f6-ecd7-426c-98a6-66ce6cec1202" containerName="telemetry-openstack-openstack-cell1" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.182904 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f896f6-ecd7-426c-98a6-66ce6cec1202" containerName="telemetry-openstack-openstack-cell1" Nov 24 13:23:17 crc kubenswrapper[4752]: E1124 13:23:17.182920 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="extract-utilities" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.182929 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="extract-utilities" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.183196 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f896f6-ecd7-426c-98a6-66ce6cec1202" containerName="telemetry-openstack-openstack-cell1" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.183220 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a841180-ec33-4eb3-8fe0-3d861eff9c1e" containerName="registry-server" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.184078 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.186083 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.186771 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.186952 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.189515 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.189768 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.192426 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d49m"] Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j66cc\" (UniqueName: \"kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255306 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255332 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255353 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.255435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357582 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j66cc\" (UniqueName: \"kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357684 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357719 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357766 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357817 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.357865 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.363114 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.363431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") pod 
\"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.364353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.365071 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.368005 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.376157 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j66cc\" (UniqueName: \"kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc\") pod \"neutron-sriov-openstack-openstack-cell1-5d49m\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:17 crc kubenswrapper[4752]: I1124 13:23:17.500010 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:23:18 crc kubenswrapper[4752]: I1124 13:23:18.724099 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-5d49m"] Nov 24 13:23:19 crc kubenswrapper[4752]: I1124 13:23:19.106082 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" event={"ID":"82f20cf4-0c2e-483d-8105-643e3c975dd2","Type":"ContainerStarted","Data":"119d9e0cadac01b440c0d48868ef270ba961391052266819d81a17b2893c179f"} Nov 24 13:23:20 crc kubenswrapper[4752]: I1124 13:23:20.117239 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" event={"ID":"82f20cf4-0c2e-483d-8105-643e3c975dd2","Type":"ContainerStarted","Data":"9c936a6b6c79c85e6022481bc4a1efe3f0ced800ef3124ce9231517caef84bb6"} Nov 24 13:23:20 crc kubenswrapper[4752]: I1124 13:23:20.140404 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" podStartSLOduration=2.7056168879999998 podStartE2EDuration="3.140384364s" podCreationTimestamp="2025-11-24 13:23:17 +0000 UTC" firstStartedPulling="2025-11-24 13:23:18.738864962 +0000 UTC m=+8204.723685251" lastFinishedPulling="2025-11-24 13:23:19.173632438 +0000 UTC m=+8205.158452727" observedRunningTime="2025-11-24 13:23:20.139592722 +0000 UTC m=+8206.124413021" watchObservedRunningTime="2025-11-24 13:23:20.140384364 +0000 UTC m=+8206.125204653" Nov 24 13:23:45 crc kubenswrapper[4752]: I1124 13:23:45.468562 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:23:45 crc kubenswrapper[4752]: I1124 13:23:45.469241 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.469848 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.470421 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.470470 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.471541 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.471610 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f" gracePeriod=600 Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.690921 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f" exitCode=0 Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.690951 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f"} Nov 24 13:24:15 crc kubenswrapper[4752]: I1124 13:24:15.691398 4752 scope.go:117] "RemoveContainer" containerID="413a3aa76aff5620f0ac17705c66d766ac69f3c105c60fd81c26ab3012916875" Nov 24 13:24:16 crc kubenswrapper[4752]: I1124 13:24:16.704022 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5"} Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.289428 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.292557 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.316848 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.364822 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcxz\" (UniqueName: \"kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.364987 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.365284 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.467545 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcxz\" (UniqueName: \"kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.467597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.467788 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.469777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.469788 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.491441 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rfcxz\" (UniqueName: \"kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz\") pod \"certified-operators-z47zf\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:25 crc kubenswrapper[4752]: I1124 13:24:25.618939 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:26 crc kubenswrapper[4752]: I1124 13:24:26.280296 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:26 crc kubenswrapper[4752]: I1124 13:24:26.857852 4752 generic.go:334] "Generic (PLEG): container finished" podID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerID="29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129" exitCode=0 Nov 24 13:24:26 crc kubenswrapper[4752]: I1124 13:24:26.857923 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerDied","Data":"29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129"} Nov 24 13:24:26 crc kubenswrapper[4752]: I1124 13:24:26.858189 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerStarted","Data":"9f30cfe14f7d81f0a225cf2c1e79d7804830e626a42d7e4edbd6df7d5dd4d5ec"} Nov 24 13:24:27 crc kubenswrapper[4752]: I1124 13:24:27.871593 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerStarted","Data":"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596"} Nov 24 13:24:28 crc kubenswrapper[4752]: I1124 13:24:28.885982 4752 generic.go:334] "Generic (PLEG): container finished" podID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerID="8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596" exitCode=0 Nov 24 13:24:28 crc kubenswrapper[4752]: I1124 13:24:28.886012 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerDied","Data":"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596"} Nov 24 13:24:29 crc kubenswrapper[4752]: I1124 13:24:29.898991 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerStarted","Data":"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408"} Nov 24 13:24:29 crc kubenswrapper[4752]: I1124 13:24:29.931456 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z47zf" podStartSLOduration=2.435074279 podStartE2EDuration="4.931427935s" podCreationTimestamp="2025-11-24 13:24:25 +0000 UTC" firstStartedPulling="2025-11-24 13:24:26.861261213 +0000 UTC m=+8272.846081512" lastFinishedPulling="2025-11-24 13:24:29.357614869 +0000 UTC m=+8275.342435168" observedRunningTime="2025-11-24 13:24:29.919151463 +0000 UTC m=+8275.903971762" watchObservedRunningTime="2025-11-24 13:24:29.931427935 +0000 UTC m=+8275.916248224" Nov 24 13:24:35 crc kubenswrapper[4752]: I1124 13:24:35.620622 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:35 crc kubenswrapper[4752]: I1124 13:24:35.621268 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:35 crc kubenswrapper[4752]: I1124 13:24:35.669398 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:36 crc kubenswrapper[4752]: I1124 13:24:36.016324 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:36 crc kubenswrapper[4752]: I1124 13:24:36.063526 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:37 crc kubenswrapper[4752]: I1124 13:24:37.985341 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z47zf" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="registry-server" containerID="cri-o://18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408" gracePeriod=2 Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.491026 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.573243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcxz\" (UniqueName: \"kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz\") pod \"67731764-c54b-41b1-acde-7f6e0bbed77c\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.574454 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content\") pod \"67731764-c54b-41b1-acde-7f6e0bbed77c\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.574500 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities\") pod \"67731764-c54b-41b1-acde-7f6e0bbed77c\" (UID: \"67731764-c54b-41b1-acde-7f6e0bbed77c\") " Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.575623 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities" (OuterVolumeSpecName: "utilities") pod "67731764-c54b-41b1-acde-7f6e0bbed77c" (UID: "67731764-c54b-41b1-acde-7f6e0bbed77c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.582132 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz" (OuterVolumeSpecName: "kube-api-access-rfcxz") pod "67731764-c54b-41b1-acde-7f6e0bbed77c" (UID: "67731764-c54b-41b1-acde-7f6e0bbed77c"). InnerVolumeSpecName "kube-api-access-rfcxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.676943 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcxz\" (UniqueName: \"kubernetes.io/projected/67731764-c54b-41b1-acde-7f6e0bbed77c-kube-api-access-rfcxz\") on node \"crc\" DevicePath \"\"" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.676978 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.794271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67731764-c54b-41b1-acde-7f6e0bbed77c" (UID: "67731764-c54b-41b1-acde-7f6e0bbed77c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:24:38 crc kubenswrapper[4752]: I1124 13:24:38.881279 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67731764-c54b-41b1-acde-7f6e0bbed77c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.000077 4752 generic.go:334] "Generic (PLEG): container finished" podID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerID="18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408" exitCode=0 Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.000271 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerDied","Data":"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408"} Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.000627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z47zf" event={"ID":"67731764-c54b-41b1-acde-7f6e0bbed77c","Type":"ContainerDied","Data":"9f30cfe14f7d81f0a225cf2c1e79d7804830e626a42d7e4edbd6df7d5dd4d5ec"} Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.000356 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z47zf" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.000664 4752 scope.go:117] "RemoveContainer" containerID="18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.039795 4752 scope.go:117] "RemoveContainer" containerID="8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.071302 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.074559 4752 scope.go:117] "RemoveContainer" containerID="29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.083250 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z47zf"] Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.115241 4752 scope.go:117] "RemoveContainer" containerID="18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408" Nov 24 13:24:39 crc kubenswrapper[4752]: E1124 13:24:39.116117 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408\": container with ID starting with 18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408 not found: ID does not exist" containerID="18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.116163 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408"} err="failed to get container status \"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408\": rpc error: code = NotFound desc = could not find container \"18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408\": container with ID starting with 18253b71d97442e56fc3ecfef3f7f3d2c0d3123e917504d60f411d243c92d408 not found: ID does not exist" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.116213 4752 scope.go:117] "RemoveContainer" containerID="8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596" Nov 24 13:24:39 crc kubenswrapper[4752]: E1124 13:24:39.116546 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596\": container with ID starting with 8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596 not found: ID does not exist" containerID="8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.116578 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596"} err="failed to get container status \"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596\": rpc error: code = NotFound desc = could not find container \"8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596\": container with ID starting with 8345284e0b81ae7a17e181083ddb551637e2a0d8ae2bab8d0a642d4886e4c596 not found: ID does not exist" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.116596 4752 scope.go:117] "RemoveContainer" 
containerID="29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129" Nov 24 13:24:39 crc kubenswrapper[4752]: E1124 13:24:39.117003 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129\": container with ID starting with 29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129 not found: ID does not exist" containerID="29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129" Nov 24 13:24:39 crc kubenswrapper[4752]: I1124 13:24:39.117059 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129"} err="failed to get container status \"29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129\": rpc error: code = NotFound desc = could not find container \"29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129\": container with ID starting with 29f973bf84b20bc4f3604e4d6beac6c1739f54dc45c7309dbda2a67350e7c129 not found: ID does not exist" Nov 24 13:24:40 crc kubenswrapper[4752]: I1124 13:24:40.742545 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" path="/var/lib/kubelet/pods/67731764-c54b-41b1-acde-7f6e0bbed77c/volumes" Nov 24 13:26:15 crc kubenswrapper[4752]: I1124 13:26:15.469505 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:26:15 crc kubenswrapper[4752]: I1124 13:26:15.470100 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:26:24 crc kubenswrapper[4752]: I1124 13:26:24.187300 4752 generic.go:334] "Generic (PLEG): container finished" podID="82f20cf4-0c2e-483d-8105-643e3c975dd2" containerID="9c936a6b6c79c85e6022481bc4a1efe3f0ced800ef3124ce9231517caef84bb6" exitCode=0 Nov 24 13:26:24 crc kubenswrapper[4752]: I1124 13:26:24.187413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" event={"ID":"82f20cf4-0c2e-483d-8105-643e3c975dd2","Type":"ContainerDied","Data":"9c936a6b6c79c85e6022481bc4a1efe3f0ced800ef3124ce9231517caef84bb6"} Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.726138 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843244 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843369 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j66cc\" (UniqueName: \"kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843400 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843466 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843578 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.843697 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.852935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc" (OuterVolumeSpecName: "kube-api-access-j66cc") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "kube-api-access-j66cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.861357 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.862168 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph" (OuterVolumeSpecName: "ceph") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:26:25 crc kubenswrapper[4752]: E1124 13:26:25.877438 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0 podName:82f20cf4-0c2e-483d-8105-643e3c975dd2 nodeName:}" failed. No retries permitted until 2025-11-24 13:26:26.377410462 +0000 UTC m=+8392.362230751 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "neutron-sriov-agent-neutron-config-0" (UniqueName: "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2") : error deleting /var/lib/kubelet/pods/82f20cf4-0c2e-483d-8105-643e3c975dd2/volume-subpaths: remove /var/lib/kubelet/pods/82f20cf4-0c2e-483d-8105-643e3c975dd2/volume-subpaths: no such file or directory Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.877839 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory" (OuterVolumeSpecName: "inventory") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.882197 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.955218 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.955258 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j66cc\" (UniqueName: \"kubernetes.io/projected/82f20cf4-0c2e-483d-8105-643e3c975dd2-kube-api-access-j66cc\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.955271 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.955283 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:25 crc kubenswrapper[4752]: I1124 13:26:25.955294 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.208197 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" event={"ID":"82f20cf4-0c2e-483d-8105-643e3c975dd2","Type":"ContainerDied","Data":"119d9e0cadac01b440c0d48868ef270ba961391052266819d81a17b2893c179f"} Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.208247 4752 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="119d9e0cadac01b440c0d48868ef270ba961391052266819d81a17b2893c179f" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.208299 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-5d49m" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.312779 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-j8d65"] Nov 24 13:26:26 crc kubenswrapper[4752]: E1124 13:26:26.313432 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f20cf4-0c2e-483d-8105-643e3c975dd2" containerName="neutron-sriov-openstack-openstack-cell1" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.313517 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f20cf4-0c2e-483d-8105-643e3c975dd2" containerName="neutron-sriov-openstack-openstack-cell1" Nov 24 13:26:26 crc kubenswrapper[4752]: E1124 13:26:26.313579 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="extract-content" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.313628 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="extract-content" Nov 24 13:26:26 crc kubenswrapper[4752]: E1124 13:26:26.313696 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="extract-utilities" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.313761 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="extract-utilities" Nov 24 13:26:26 crc kubenswrapper[4752]: E1124 13:26:26.313845 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="registry-server" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.313902 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="registry-server" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.314168 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="67731764-c54b-41b1-acde-7f6e0bbed77c" containerName="registry-server" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.314243 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f20cf4-0c2e-483d-8105-643e3c975dd2" containerName="neutron-sriov-openstack-openstack-cell1" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.315086 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.317954 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.340921 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-j8d65"] Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.464938 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") pod \"82f20cf4-0c2e-483d-8105-643e3c975dd2\" (UID: \"82f20cf4-0c2e-483d-8105-643e3c975dd2\") " Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.465345 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6ck\" (UniqueName: \"kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.465373 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.465415 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.465435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.466123 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.466355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: 
I1124 13:26:26.469648 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "82f20cf4-0c2e-483d-8105-643e3c975dd2" (UID: "82f20cf4-0c2e-483d-8105-643e3c975dd2"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569234 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569661 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569773 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6ck\" (UniqueName: \"kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569802 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.569894 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/82f20cf4-0c2e-483d-8105-643e3c975dd2-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.574176 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.576299 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.576398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.583981 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.590015 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.593352 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6ck\" (UniqueName: \"kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck\") pod \"neutron-dhcp-openstack-openstack-cell1-j8d65\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:26 crc kubenswrapper[4752]: I1124 13:26:26.637546 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:26:27 crc kubenswrapper[4752]: I1124 13:26:27.195218 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-j8d65"] Nov 24 13:26:27 crc kubenswrapper[4752]: I1124 13:26:27.199295 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:26:27 crc kubenswrapper[4752]: I1124 13:26:27.218476 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" event={"ID":"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7","Type":"ContainerStarted","Data":"68c4730acd4d36eef160e8d16809b9ee88493b83bcebb3df4d3db8e0b6a35768"} Nov 24 13:26:29 crc kubenswrapper[4752]: I1124 13:26:29.240004 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" event={"ID":"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7","Type":"ContainerStarted","Data":"6b4a49f8622a3b349957eff02537004e2a9b6b774093089cffcd5ec66ed9858f"} Nov 24 13:26:29 crc kubenswrapper[4752]: I1124 13:26:29.269541 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" podStartSLOduration=2.301664414 podStartE2EDuration="3.269524271s" podCreationTimestamp="2025-11-24 13:26:26 +0000 UTC" firstStartedPulling="2025-11-24 13:26:27.199001137 +0000 UTC m=+8393.183821426" lastFinishedPulling="2025-11-24 13:26:28.166860984 +0000 UTC m=+8394.151681283" observedRunningTime="2025-11-24 13:26:29.257461015 +0000 UTC m=+8395.242281314" watchObservedRunningTime="2025-11-24 13:26:29.269524271 +0000 UTC m=+8395.254344550" Nov 24 13:26:45 crc kubenswrapper[4752]: I1124 13:26:45.469024 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:26:45 crc kubenswrapper[4752]: I1124 13:26:45.471061 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.468510 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.469149 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.469199 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.470064 4752 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.470131 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" gracePeriod=600 Nov 24 13:27:15 crc kubenswrapper[4752]: E1124 13:27:15.640968 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.768718 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" exitCode=0 Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.768783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5"} Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.768841 4752 scope.go:117] "RemoveContainer" containerID="6a9395a26994eb72298ba391dfa4612816d3a1b2facd98b558ce05b9e2e13f0f" Nov 24 13:27:15 crc kubenswrapper[4752]: I1124 13:27:15.769981 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:27:15 crc kubenswrapper[4752]: E1124 13:27:15.770449 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:27:27 crc kubenswrapper[4752]: I1124 13:27:27.727467 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:27:27 crc kubenswrapper[4752]: E1124 13:27:27.728356 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:27:38 crc kubenswrapper[4752]: I1124 13:27:38.728070 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:27:38 crc kubenswrapper[4752]: E1124 
13:27:38.728854 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:27:50 crc kubenswrapper[4752]: I1124 13:27:50.728396 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:27:50 crc kubenswrapper[4752]: E1124 13:27:50.729492 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:28:03 crc kubenswrapper[4752]: I1124 13:28:03.728935 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:28:03 crc kubenswrapper[4752]: E1124 13:28:03.730788 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:28:16 crc kubenswrapper[4752]: I1124 13:28:16.728546 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:28:16 crc kubenswrapper[4752]: E1124 13:28:16.729421 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:28:29 crc kubenswrapper[4752]: I1124 13:28:29.730943 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:28:29 crc kubenswrapper[4752]: E1124 13:28:29.734523 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:28:44 crc kubenswrapper[4752]: I1124 13:28:44.737672 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:28:44 crc kubenswrapper[4752]: E1124 13:28:44.738508 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:28:55 crc kubenswrapper[4752]: I1124 13:28:55.728550 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:28:55 crc kubenswrapper[4752]: E1124 13:28:55.729604 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:29:08 crc kubenswrapper[4752]: I1124 13:29:08.728643 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:29:08 crc kubenswrapper[4752]: E1124 13:29:08.729542 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:29:19 crc kubenswrapper[4752]: I1124 13:29:19.727548 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:29:19 crc kubenswrapper[4752]: E1124 13:29:19.728305 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.813448 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.826784 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.845624 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.876224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.876381 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.876418 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfcz\" (UniqueName: \"kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.978059 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.978167 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.978188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfcz\" (UniqueName: \"kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.978904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:20 crc kubenswrapper[4752]: I1124 13:29:20.979159 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:21 crc kubenswrapper[4752]: I1124 13:29:21.012498 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5qfcz\" (UniqueName: \"kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz\") pod \"redhat-operators-zjscd\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:21 crc kubenswrapper[4752]: I1124 13:29:21.163723 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:21 crc kubenswrapper[4752]: I1124 13:29:21.507542 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.193871 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.196500 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.205304 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.319933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.320215 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wcfl\" (UniqueName: \"kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.320239 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.391762 4752 generic.go:334] "Generic (PLEG): container finished" podID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerID="059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4" exitCode=0 Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.391815 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerDied","Data":"059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4"} Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.391847 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerStarted","Data":"aac999b53678f2abc738ff47a3ae8c3fa46ae3d95656bd99ef10513c28028aed"} Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.421832 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities\") pod 
\"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.421889 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wcfl\" (UniqueName: \"kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.421916 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.422331 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.422661 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.449350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wcfl\" (UniqueName: \"kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl\") pod \"redhat-marketplace-9kbdr\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:22 crc kubenswrapper[4752]: I1124 13:29:22.526944 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:23 crc kubenswrapper[4752]: I1124 13:29:23.102559 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:23 crc kubenswrapper[4752]: I1124 13:29:23.416030 4752 generic.go:334] "Generic (PLEG): container finished" podID="06d1c715-750f-4588-a900-b02807e01098" containerID="fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137" exitCode=0 Nov 24 13:29:23 crc kubenswrapper[4752]: I1124 13:29:23.416140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerDied","Data":"fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137"} Nov 24 13:29:23 crc kubenswrapper[4752]: I1124 13:29:23.416354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerStarted","Data":"d6a7f0b6a7067ad8fc3f4ebdf691d2f7f0445feb20a54a8cdc5e5d8b1cbff0c6"} Nov 24 13:29:24 crc kubenswrapper[4752]: I1124 13:29:24.427770 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerStarted","Data":"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64"} Nov 24 13:29:25 crc kubenswrapper[4752]: I1124 13:29:25.438469 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerStarted","Data":"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0"} Nov 24 13:29:26 crc kubenswrapper[4752]: I1124 13:29:26.450198 4752 generic.go:334] "Generic (PLEG): container finished" podID="06d1c715-750f-4588-a900-b02807e01098" containerID="8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0" exitCode=0 Nov 24 13:29:26 crc kubenswrapper[4752]: I1124 13:29:26.450296 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerDied","Data":"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0"} Nov 24 13:29:27 crc kubenswrapper[4752]: I1124 13:29:27.463695 4752 generic.go:334] "Generic (PLEG): container finished" podID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerID="da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64" exitCode=0 Nov 24 13:29:27 crc kubenswrapper[4752]: I1124 13:29:27.463740 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerDied","Data":"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64"} Nov 24 13:29:27 crc kubenswrapper[4752]: I1124 13:29:27.468328 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerStarted","Data":"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0"} Nov 24 13:29:27 crc kubenswrapper[4752]: I1124 13:29:27.522110 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9kbdr" podStartSLOduration=2.043919072 podStartE2EDuration="5.522082474s" podCreationTimestamp="2025-11-24 
13:29:22 +0000 UTC" firstStartedPulling="2025-11-24 13:29:23.41861324 +0000 UTC m=+8569.403433529" lastFinishedPulling="2025-11-24 13:29:26.896776642 +0000 UTC m=+8572.881596931" observedRunningTime="2025-11-24 13:29:27.510682867 +0000 UTC m=+8573.495503166" watchObservedRunningTime="2025-11-24 13:29:27.522082474 +0000 UTC m=+8573.506902763" Nov 24 13:29:28 crc kubenswrapper[4752]: I1124 13:29:28.481854 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerStarted","Data":"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8"} Nov 24 13:29:28 crc kubenswrapper[4752]: I1124 13:29:28.508516 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjscd" podStartSLOduration=3.066308016 podStartE2EDuration="8.508499303s" podCreationTimestamp="2025-11-24 13:29:20 +0000 UTC" firstStartedPulling="2025-11-24 13:29:22.394972157 +0000 UTC m=+8568.379792446" lastFinishedPulling="2025-11-24 13:29:27.837163424 +0000 UTC m=+8573.821983733" observedRunningTime="2025-11-24 13:29:28.497995972 +0000 UTC m=+8574.482816261" watchObservedRunningTime="2025-11-24 13:29:28.508499303 +0000 UTC m=+8574.493319592" Nov 24 13:29:31 crc kubenswrapper[4752]: I1124 13:29:31.164441 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:31 crc kubenswrapper[4752]: I1124 13:29:31.164995 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:32 crc kubenswrapper[4752]: I1124 13:29:32.217687 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjscd" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="registry-server" probeResult="failure" output=< Nov 24 13:29:32 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 13:29:32 crc kubenswrapper[4752]: > Nov 24 13:29:32 crc kubenswrapper[4752]: I1124 13:29:32.527562 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:32 crc kubenswrapper[4752]: I1124 13:29:32.527635 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:32 crc kubenswrapper[4752]: I1124 13:29:32.572812 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:33 crc kubenswrapper[4752]: I1124 13:29:33.582680 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:33 crc kubenswrapper[4752]: I1124 13:29:33.635028 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:33 crc kubenswrapper[4752]: I1124 13:29:33.728618 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:29:33 crc kubenswrapper[4752]: E1124 13:29:33.729177 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:29:35 crc kubenswrapper[4752]: I1124 13:29:35.548989 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9kbdr" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="registry-server" containerID="cri-o://fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0" gracePeriod=2 Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.074934 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.236221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wcfl\" (UniqueName: \"kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl\") pod \"06d1c715-750f-4588-a900-b02807e01098\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.236686 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities\") pod \"06d1c715-750f-4588-a900-b02807e01098\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.237221 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content\") pod \"06d1c715-750f-4588-a900-b02807e01098\" (UID: \"06d1c715-750f-4588-a900-b02807e01098\") " Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.237548 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities" (OuterVolumeSpecName: "utilities") pod "06d1c715-750f-4588-a900-b02807e01098" (UID: "06d1c715-750f-4588-a900-b02807e01098"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.238697 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.243409 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl" (OuterVolumeSpecName: "kube-api-access-8wcfl") pod "06d1c715-750f-4588-a900-b02807e01098" (UID: "06d1c715-750f-4588-a900-b02807e01098"). InnerVolumeSpecName "kube-api-access-8wcfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.255683 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06d1c715-750f-4588-a900-b02807e01098" (UID: "06d1c715-750f-4588-a900-b02807e01098"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.341326 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06d1c715-750f-4588-a900-b02807e01098-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.341361 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wcfl\" (UniqueName: \"kubernetes.io/projected/06d1c715-750f-4588-a900-b02807e01098-kube-api-access-8wcfl\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.573090 4752 generic.go:334] "Generic (PLEG): container finished" podID="06d1c715-750f-4588-a900-b02807e01098" containerID="fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0" exitCode=0 Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.573154 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerDied","Data":"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0"} Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.573188 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kbdr" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.573202 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kbdr" event={"ID":"06d1c715-750f-4588-a900-b02807e01098","Type":"ContainerDied","Data":"d6a7f0b6a7067ad8fc3f4ebdf691d2f7f0445feb20a54a8cdc5e5d8b1cbff0c6"} Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.573218 4752 scope.go:117] "RemoveContainer" containerID="fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.601548 4752 scope.go:117] "RemoveContainer" containerID="8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.623004 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.633163 4752 scope.go:117] "RemoveContainer" containerID="fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.633235 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kbdr"] Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.715210 4752 scope.go:117] "RemoveContainer" containerID="fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0" Nov 24 13:29:36 crc kubenswrapper[4752]: E1124 13:29:36.715692 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0\": container with ID starting with fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0 not found: ID does not exist" containerID="fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.715733 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0"} err="failed to get container status 
\"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0\": rpc error: code = NotFound desc = could not find container \"fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0\": container with ID starting with fd0814f53d95f29023021a0bc120a85d1065d487e3d29074b06f7f4a29187ba0 not found: ID does not exist" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.715771 4752 scope.go:117] "RemoveContainer" containerID="8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0" Nov 24 13:29:36 crc kubenswrapper[4752]: E1124 13:29:36.716226 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0\": container with ID starting with 8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0 not found: ID does not exist" containerID="8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.716262 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0"} err="failed to get container status \"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0\": rpc error: code = NotFound desc = could not find container \"8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0\": container with ID starting with 8d26c7e05a5bdef70e2ed9bf2d786582f1d4e4e8f0556760fbeb792f95a0e0a0 not found: ID does not exist" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.716291 4752 scope.go:117] "RemoveContainer" containerID="fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137" Nov 24 13:29:36 crc kubenswrapper[4752]: E1124 13:29:36.716590 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137\": container with ID starting with fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137 not found: ID does not exist" containerID="fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.716614 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137"} err="failed to get container status \"fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137\": rpc error: code = NotFound desc = could not find container \"fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137\": container with ID starting with fb7fd57c528a563e969983580577d6498d145f1b7ec89e70f7d40779ef7f0137 not found: ID does not exist" Nov 24 13:29:36 crc kubenswrapper[4752]: I1124 13:29:36.755675 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d1c715-750f-4588-a900-b02807e01098" path="/var/lib/kubelet/pods/06d1c715-750f-4588-a900-b02807e01098/volumes" Nov 24 13:29:41 crc kubenswrapper[4752]: I1124 13:29:41.243101 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:41 crc kubenswrapper[4752]: I1124 13:29:41.289878 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:41 crc kubenswrapper[4752]: I1124 13:29:41.481338 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:42 crc kubenswrapper[4752]: I1124 13:29:42.650009 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjscd" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="registry-server" containerID="cri-o://86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8" gracePeriod=2 Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.217683 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.384388 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfcz\" (UniqueName: \"kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz\") pod \"22281e5b-c91e-439d-8cdd-57f5587b829b\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.384508 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content\") pod \"22281e5b-c91e-439d-8cdd-57f5587b829b\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.384688 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities\") pod \"22281e5b-c91e-439d-8cdd-57f5587b829b\" (UID: \"22281e5b-c91e-439d-8cdd-57f5587b829b\") " Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.385639 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities" (OuterVolumeSpecName: "utilities") pod "22281e5b-c91e-439d-8cdd-57f5587b829b" (UID: "22281e5b-c91e-439d-8cdd-57f5587b829b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.393667 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz" (OuterVolumeSpecName: "kube-api-access-5qfcz") pod "22281e5b-c91e-439d-8cdd-57f5587b829b" (UID: "22281e5b-c91e-439d-8cdd-57f5587b829b"). InnerVolumeSpecName "kube-api-access-5qfcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.480317 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22281e5b-c91e-439d-8cdd-57f5587b829b" (UID: "22281e5b-c91e-439d-8cdd-57f5587b829b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.487008 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.487042 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfcz\" (UniqueName: \"kubernetes.io/projected/22281e5b-c91e-439d-8cdd-57f5587b829b-kube-api-access-5qfcz\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.487052 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22281e5b-c91e-439d-8cdd-57f5587b829b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.664672 4752 generic.go:334] "Generic (PLEG): container finished" podID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerID="86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8" exitCode=0 Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.664930 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerDied","Data":"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8"} Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.665734 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjscd" event={"ID":"22281e5b-c91e-439d-8cdd-57f5587b829b","Type":"ContainerDied","Data":"aac999b53678f2abc738ff47a3ae8c3fa46ae3d95656bd99ef10513c28028aed"} Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.665829 4752 scope.go:117] "RemoveContainer" containerID="86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.664994 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjscd" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.700042 4752 scope.go:117] "RemoveContainer" containerID="da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.700730 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.709806 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjscd"] Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.735554 4752 scope.go:117] "RemoveContainer" containerID="059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.784635 4752 scope.go:117] "RemoveContainer" containerID="86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8" Nov 24 13:29:43 crc kubenswrapper[4752]: E1124 13:29:43.785221 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8\": container with ID starting with 86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8 not found: ID does not exist" containerID="86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.785271 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8"} err="failed to get container status \"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8\": rpc error: code = NotFound desc = could not find container \"86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8\": container with ID starting with 86605d06b4c2b1944b340db7b846a3dc687adcbb219426d0f4e447ce526f82e8 not found: ID does not exist" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.785300 4752 scope.go:117] "RemoveContainer" containerID="da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64" Nov 24 13:29:43 crc kubenswrapper[4752]: E1124 13:29:43.785987 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64\": container with ID starting with da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64 not found: ID does not exist" containerID="da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.786023 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64"} err="failed to get container status \"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64\": rpc error: code = NotFound desc = could not find container \"da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64\": container with ID starting with da5ef47379ff1f1cbb596c4478feb0d1487aed2fd846b733ad0e42998585ce64 not found: ID does not exist" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.786043 4752 scope.go:117] "RemoveContainer" containerID="059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4" Nov 24 13:29:43 crc kubenswrapper[4752]: E1124 13:29:43.789413 4752 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4\": container with ID starting with 059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4 not found: ID does not exist" containerID="059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4" Nov 24 13:29:43 crc kubenswrapper[4752]: I1124 13:29:43.789463 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4"} err="failed to get container status \"059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4\": rpc error: code = NotFound desc = could not find container \"059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4\": container with ID starting with 059a5aeb97512022af6da46675da64d8a8fdf6d63a365a46cb9170f2906f31d4 not found: ID does not exist" Nov 24 13:29:44 crc kubenswrapper[4752]: I1124 13:29:44.741188 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" path="/var/lib/kubelet/pods/22281e5b-c91e-439d-8cdd-57f5587b829b/volumes" Nov 24 13:29:45 crc kubenswrapper[4752]: I1124 13:29:45.728511 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:29:45 crc kubenswrapper[4752]: E1124 13:29:45.729087 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:29:57 crc kubenswrapper[4752]: I1124 13:29:57.728356 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:29:57 crc kubenswrapper[4752]: E1124 13:29:57.729699 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.157322 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj"] Nov 24 13:30:00 crc kubenswrapper[4752]: E1124 13:30:00.158368 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="extract-content" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158385 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="extract-content" Nov 24 13:30:00 crc kubenswrapper[4752]: E1124 13:30:00.158395 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158402 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: 
E1124 13:30:00.158419 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="extract-content" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158426 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="extract-content" Nov 24 13:30:00 crc kubenswrapper[4752]: E1124 13:30:00.158445 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="extract-utilities" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158451 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="extract-utilities" Nov 24 13:30:00 crc kubenswrapper[4752]: E1124 13:30:00.158469 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="extract-utilities" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158475 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="extract-utilities" Nov 24 13:30:00 crc kubenswrapper[4752]: E1124 13:30:00.158520 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158527 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158783 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d1c715-750f-4588-a900-b02807e01098" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.158807 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="22281e5b-c91e-439d-8cdd-57f5587b829b" containerName="registry-server" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.159578 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.161324 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.161550 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.167470 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj"] Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.256882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchd5\" (UniqueName: \"kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.257261 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.257340 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.359806 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchd5\" (UniqueName: \"kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.360177 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.360216 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.361840 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume\") pod 
\"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.369969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.381763 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchd5\" (UniqueName: \"kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5\") pod \"collect-profiles-29399850-m9vgj\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.493090 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:00 crc kubenswrapper[4752]: I1124 13:30:00.958600 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj"] Nov 24 13:30:00 crc kubenswrapper[4752]: W1124 13:30:00.974525 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42159d58_d030_45af_838e_37766774dd1d.slice/crio-88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66 WatchSource:0}: Error finding container 88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66: Status 404 returned error can't find the container with id 88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66 Nov 24 13:30:01 crc kubenswrapper[4752]: I1124 13:30:01.848685 4752 generic.go:334] "Generic (PLEG): container finished" podID="42159d58-d030-45af-838e-37766774dd1d" containerID="30be99284b018398f36c5ded86c69b0265600a66c4414e81e1a0db5da997a52a" exitCode=0 Nov 24 13:30:01 crc kubenswrapper[4752]: I1124 13:30:01.849038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" event={"ID":"42159d58-d030-45af-838e-37766774dd1d","Type":"ContainerDied","Data":"30be99284b018398f36c5ded86c69b0265600a66c4414e81e1a0db5da997a52a"} Nov 24 13:30:01 crc kubenswrapper[4752]: I1124 13:30:01.849075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" event={"ID":"42159d58-d030-45af-838e-37766774dd1d","Type":"ContainerStarted","Data":"88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66"} Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.289814 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.457262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume\") pod \"42159d58-d030-45af-838e-37766774dd1d\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.457387 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume\") pod \"42159d58-d030-45af-838e-37766774dd1d\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.457530 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zchd5\" (UniqueName: \"kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5\") pod \"42159d58-d030-45af-838e-37766774dd1d\" (UID: \"42159d58-d030-45af-838e-37766774dd1d\") " Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.458056 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "42159d58-d030-45af-838e-37766774dd1d" (UID: "42159d58-d030-45af-838e-37766774dd1d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.458568 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42159d58-d030-45af-838e-37766774dd1d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.462646 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5" (OuterVolumeSpecName: "kube-api-access-zchd5") pod "42159d58-d030-45af-838e-37766774dd1d" (UID: "42159d58-d030-45af-838e-37766774dd1d"). InnerVolumeSpecName "kube-api-access-zchd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.466054 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42159d58-d030-45af-838e-37766774dd1d" (UID: "42159d58-d030-45af-838e-37766774dd1d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.560983 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42159d58-d030-45af-838e-37766774dd1d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.561048 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zchd5\" (UniqueName: \"kubernetes.io/projected/42159d58-d030-45af-838e-37766774dd1d-kube-api-access-zchd5\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.874095 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" event={"ID":"42159d58-d030-45af-838e-37766774dd1d","Type":"ContainerDied","Data":"88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66"} Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.874515 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88549d5cb75e89bfe1c734bd23c2b87498279eacf4a1017d36720ad9bf79da66" Nov 24 13:30:03 crc kubenswrapper[4752]: I1124 13:30:03.874179 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399850-m9vgj" Nov 24 13:30:04 crc kubenswrapper[4752]: I1124 13:30:04.365728 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8"] Nov 24 13:30:04 crc kubenswrapper[4752]: I1124 13:30:04.374539 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399805-5vmr8"] Nov 24 13:30:04 crc kubenswrapper[4752]: I1124 13:30:04.743969 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f37a67-1524-4a04-b7e6-7546a1b32ee2" path="/var/lib/kubelet/pods/10f37a67-1524-4a04-b7e6-7546a1b32ee2/volumes" Nov 24 13:30:11 crc kubenswrapper[4752]: I1124 13:30:11.728295 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:30:11 crc kubenswrapper[4752]: E1124 13:30:11.729357 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:30:23 crc kubenswrapper[4752]: I1124 13:30:23.728623 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:30:23 crc kubenswrapper[4752]: E1124 13:30:23.729893 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:30:38 crc kubenswrapper[4752]: I1124 13:30:38.246073 4752 generic.go:334] "Generic (PLEG): container finished" podID="4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" 
containerID="6b4a49f8622a3b349957eff02537004e2a9b6b774093089cffcd5ec66ed9858f" exitCode=0 Nov 24 13:30:38 crc kubenswrapper[4752]: I1124 13:30:38.246186 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" event={"ID":"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7","Type":"ContainerDied","Data":"6b4a49f8622a3b349957eff02537004e2a9b6b774093089cffcd5ec66ed9858f"} Nov 24 13:30:38 crc kubenswrapper[4752]: I1124 13:30:38.730272 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:30:38 crc kubenswrapper[4752]: E1124 13:30:38.730732 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.750589 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.886615 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.886846 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.886958 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.887013 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn6ck\" (UniqueName: \"kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.887112 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.887273 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory\") pod \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\" (UID: \"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7\") " Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.892239 4752 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph" (OuterVolumeSpecName: "ceph") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.892287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.895006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck" (OuterVolumeSpecName: "kube-api-access-dn6ck") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "kube-api-access-dn6ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.916493 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory" (OuterVolumeSpecName: "inventory") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.918680 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.925520 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" (UID: "4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992144 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992417 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992498 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992576 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn6ck\" (UniqueName: \"kubernetes.io/projected/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-kube-api-access-dn6ck\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992650 4752 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:39 crc kubenswrapper[4752]: I1124 13:30:39.992716 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:40 crc kubenswrapper[4752]: I1124 13:30:40.274906 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" event={"ID":"4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7","Type":"ContainerDied","Data":"68c4730acd4d36eef160e8d16809b9ee88493b83bcebb3df4d3db8e0b6a35768"} Nov 24 13:30:40 crc kubenswrapper[4752]: I1124 13:30:40.274948 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c4730acd4d36eef160e8d16809b9ee88493b83bcebb3df4d3db8e0b6a35768" Nov 24 13:30:40 crc kubenswrapper[4752]: I1124 13:30:40.274984 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-j8d65" Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.346864 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.347697 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e990c3beac152b75ded134cdd0efd6eeba5575ee5897b402d556662bee762d5f" gracePeriod=30 Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.825689 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.825919 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" containerName="nova-cell1-conductor-conductor" containerID="cri-o://732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8" gracePeriod=30 Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.977551 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.977820 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0d7b2d8a-5981-49b8-bebe-43341522af04" containerName="nova-scheduler-scheduler" containerID="cri-o://84a56a87791f3a03020377a51d622beb111f52f3e3611e7f4f1dd6c2e0bea6ea" gracePeriod=30 Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.995337 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.995617 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-log" containerID="cri-o://39289c74d13bc13a138076c50a5b483d89a5ef2e1c92a883045bd725d62f81d5" gracePeriod=30 Nov 24 13:30:48 crc kubenswrapper[4752]: I1124 13:30:48.995707 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" containerID="cri-o://1b657b262b7c927418844cefa14a24a1f08b8545a12dc6c0372ba15a05df8e69" gracePeriod=30 Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.009697 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.009977 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" containerID="cri-o://a56df5bb968f85b9c4b51ed2a7e57a2c030658b57c502d986589c01c7a4913f5" gracePeriod=30 Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.010026 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" containerID="cri-o://c4f711db230ebc73693e2f6379e0f90fe08e7aafcdf39ea36f909c060c2f4c6b" gracePeriod=30 Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.380070 4752 generic.go:334] "Generic (PLEG): container finished" podID="57480bf6-a5ed-4c0b-b787-fc2863744323" 
containerID="39289c74d13bc13a138076c50a5b483d89a5ef2e1c92a883045bd725d62f81d5" exitCode=143 Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.380228 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerDied","Data":"39289c74d13bc13a138076c50a5b483d89a5ef2e1c92a883045bd725d62f81d5"} Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.393729 4752 generic.go:334] "Generic (PLEG): container finished" podID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerID="a56df5bb968f85b9c4b51ed2a7e57a2c030658b57c502d986589c01c7a4913f5" exitCode=143 Nov 24 13:30:49 crc kubenswrapper[4752]: I1124 13:30:49.393799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerDied","Data":"a56df5bb968f85b9c4b51ed2a7e57a2c030658b57c502d986589c01c7a4913f5"} Nov 24 13:30:50 crc kubenswrapper[4752]: I1124 13:30:50.104238 4752 scope.go:117] "RemoveContainer" containerID="ef9f7dec628d639345ec119d687bcf22ec53c0291705b4433e268c72e4abc1ea" Nov 24 13:30:50 crc kubenswrapper[4752]: I1124 13:30:50.405144 4752 generic.go:334] "Generic (PLEG): container finished" podID="0d7b2d8a-5981-49b8-bebe-43341522af04" containerID="84a56a87791f3a03020377a51d622beb111f52f3e3611e7f4f1dd6c2e0bea6ea" exitCode=0 Nov 24 13:30:50 crc kubenswrapper[4752]: I1124 13:30:50.405187 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d7b2d8a-5981-49b8-bebe-43341522af04","Type":"ContainerDied","Data":"84a56a87791f3a03020377a51d622beb111f52f3e3611e7f4f1dd6c2e0bea6ea"} Nov 24 13:30:50 crc kubenswrapper[4752]: I1124 13:30:50.728618 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:30:50 crc kubenswrapper[4752]: E1124 13:30:50.730238 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:30:50 crc kubenswrapper[4752]: I1124 13:30:50.911018 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.027359 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle\") pod \"0d7b2d8a-5981-49b8-bebe-43341522af04\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.027699 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6w8m\" (UniqueName: \"kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m\") pod \"0d7b2d8a-5981-49b8-bebe-43341522af04\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.027785 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data\") pod \"0d7b2d8a-5981-49b8-bebe-43341522af04\" (UID: \"0d7b2d8a-5981-49b8-bebe-43341522af04\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.032290 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m" (OuterVolumeSpecName: "kube-api-access-b6w8m") pod "0d7b2d8a-5981-49b8-bebe-43341522af04" (UID: "0d7b2d8a-5981-49b8-bebe-43341522af04"). InnerVolumeSpecName "kube-api-access-b6w8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.061898 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data" (OuterVolumeSpecName: "config-data") pod "0d7b2d8a-5981-49b8-bebe-43341522af04" (UID: "0d7b2d8a-5981-49b8-bebe-43341522af04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.070645 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d7b2d8a-5981-49b8-bebe-43341522af04" (UID: "0d7b2d8a-5981-49b8-bebe-43341522af04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.130724 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.130785 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6w8m\" (UniqueName: \"kubernetes.io/projected/0d7b2d8a-5981-49b8-bebe-43341522af04-kube-api-access-b6w8m\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.130799 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7b2d8a-5981-49b8-bebe-43341522af04-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.194341 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.333340 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle\") pod \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.333487 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmb46\" (UniqueName: \"kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46\") pod \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.333550 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data\") pod \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\" (UID: \"d14b0aa7-9a78-47bc-b0ae-e78d717028fc\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.339304 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46" (OuterVolumeSpecName: "kube-api-access-wmb46") pod "d14b0aa7-9a78-47bc-b0ae-e78d717028fc" (UID: "d14b0aa7-9a78-47bc-b0ae-e78d717028fc"). InnerVolumeSpecName "kube-api-access-wmb46". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.366164 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data" (OuterVolumeSpecName: "config-data") pod "d14b0aa7-9a78-47bc-b0ae-e78d717028fc" (UID: "d14b0aa7-9a78-47bc-b0ae-e78d717028fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.384840 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d14b0aa7-9a78-47bc-b0ae-e78d717028fc" (UID: "d14b0aa7-9a78-47bc-b0ae-e78d717028fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.417648 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d7b2d8a-5981-49b8-bebe-43341522af04","Type":"ContainerDied","Data":"a52245349bad39fbaad08e3ce6f1c6051909cb812c27b73697f40a4cacd6ea91"} Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.417695 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.417719 4752 scope.go:117] "RemoveContainer" containerID="84a56a87791f3a03020377a51d622beb111f52f3e3611e7f4f1dd6c2e0bea6ea" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.420971 4752 generic.go:334] "Generic (PLEG): container finished" podID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" containerID="732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8" exitCode=0 Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.421037 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d14b0aa7-9a78-47bc-b0ae-e78d717028fc","Type":"ContainerDied","Data":"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8"} Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.421066 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d14b0aa7-9a78-47bc-b0ae-e78d717028fc","Type":"ContainerDied","Data":"6caa0846c9f8e67e866e7e3fdccd79606ee83c416acd802ba145599757a0079e"} Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.421121 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.425057 4752 generic.go:334] "Generic (PLEG): container finished" podID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" containerID="e990c3beac152b75ded134cdd0efd6eeba5575ee5897b402d556662bee762d5f" exitCode=0 Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.425112 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a","Type":"ContainerDied","Data":"e990c3beac152b75ded134cdd0efd6eeba5575ee5897b402d556662bee762d5f"} Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.436067 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.436102 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmb46\" (UniqueName: \"kubernetes.io/projected/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-kube-api-access-wmb46\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.436117 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b0aa7-9a78-47bc-b0ae-e78d717028fc-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.517847 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.525487 4752 scope.go:117] "RemoveContainer" containerID="732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.539189 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data\") pod \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.539247 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnqbr\" (UniqueName: \"kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr\") pod \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.539296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle\") pod \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\" (UID: \"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a\") " Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.577543 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr" (OuterVolumeSpecName: "kube-api-access-hnqbr") pod "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" (UID: "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a"). InnerVolumeSpecName "kube-api-access-hnqbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.582843 4752 scope.go:117] "RemoveContainer" containerID="732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8" Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.586413 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8\": container with ID starting with 732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8 not found: ID does not exist" containerID="732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.586600 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8"} err="failed to get container status \"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8\": rpc error: code = NotFound desc = could not find container \"732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8\": container with ID starting with 732e9fc273cea2ed5d9d1b80fcc7ea7ac14e61ae1aeb25031f0da78653225ac8 not found: ID does not exist" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.590344 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data" (OuterVolumeSpecName: "config-data") pod "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" (UID: "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.603337 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" (UID: "041c17c0-b4dd-46a2-ba18-2f972ad0ff6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.609281 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.619462 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.632811 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.641986 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.642024 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.642035 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnqbr\" (UniqueName: \"kubernetes.io/projected/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a-kube-api-access-hnqbr\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.646078 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.660393 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.660936 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" containerName="nova-cell1-conductor-conductor" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.660956 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" containerName="nova-cell1-conductor-conductor" Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.660988 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7b2d8a-5981-49b8-bebe-43341522af04" containerName="nova-scheduler-scheduler" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.660997 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7b2d8a-5981-49b8-bebe-43341522af04" containerName="nova-scheduler-scheduler" Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.661038 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42159d58-d030-45af-838e-37766774dd1d" containerName="collect-profiles" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661049 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="42159d58-d030-45af-838e-37766774dd1d" containerName="collect-profiles" Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.661066 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" containerName="nova-cell0-conductor-conductor" Nov 24 
13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661073 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" containerName="nova-cell0-conductor-conductor" Nov 24 13:30:51 crc kubenswrapper[4752]: E1124 13:30:51.661089 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661096 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661352 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661381 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" containerName="nova-cell0-conductor-conductor" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661398 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7b2d8a-5981-49b8-bebe-43341522af04" containerName="nova-scheduler-scheduler" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661414 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" containerName="nova-cell1-conductor-conductor" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.661440 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="42159d58-d030-45af-838e-37766774dd1d" containerName="collect-profiles" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.662466 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.664253 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.670420 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.672232 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.674660 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.683834 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.696076 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.746607 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrk58\" (UniqueName: \"kubernetes.io/projected/d2c088e9-08c6-4695-8c18-c345a40d1eb1-kube-api-access-rrk58\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.747070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.747419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.849543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.849677 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.849880 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.849916 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-config-data\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.850068 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrk58\" (UniqueName: \"kubernetes.io/projected/d2c088e9-08c6-4695-8c18-c345a40d1eb1-kube-api-access-rrk58\") pod 
\"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.850138 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bt4\" (UniqueName: \"kubernetes.io/projected/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-kube-api-access-x6bt4\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.853702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.853883 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c088e9-08c6-4695-8c18-c345a40d1eb1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.865620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrk58\" (UniqueName: \"kubernetes.io/projected/d2c088e9-08c6-4695-8c18-c345a40d1eb1-kube-api-access-rrk58\") pod \"nova-cell1-conductor-0\" (UID: \"d2c088e9-08c6-4695-8c18-c345a40d1eb1\") " pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.951945 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bt4\" (UniqueName: \"kubernetes.io/projected/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-kube-api-access-x6bt4\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.952071 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.952165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-config-data\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.956080 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.956345 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-config-data\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.974214 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x6bt4\" (UniqueName: \"kubernetes.io/projected/e9d8eb2b-bb50-40ba-89d0-38898e19bf14-kube-api-access-x6bt4\") pod \"nova-scheduler-0\" (UID: \"e9d8eb2b-bb50-40ba-89d0-38898e19bf14\") " pod="openstack/nova-scheduler-0" Nov 24 13:30:51 crc kubenswrapper[4752]: I1124 13:30:51.991234 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.004810 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.162171 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:32948->10.217.1.84:8775: read: connection reset by peer" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.162251 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:32956->10.217.1.84:8775: read: connection reset by peer" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.203157 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": read tcp 10.217.0.2:52880->10.217.1.83:8774: read: connection reset by peer" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.203228 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.83:8774/\": read tcp 10.217.0.2:52876->10.217.1.83:8774: read: connection reset by peer" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.440737 4752 generic.go:334] "Generic (PLEG): container finished" podID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerID="1b657b262b7c927418844cefa14a24a1f08b8545a12dc6c0372ba15a05df8e69" exitCode=0 Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.440837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerDied","Data":"1b657b262b7c927418844cefa14a24a1f08b8545a12dc6c0372ba15a05df8e69"} Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.455540 4752 generic.go:334] "Generic (PLEG): container finished" podID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerID="c4f711db230ebc73693e2f6379e0f90fe08e7aafcdf39ea36f909c060c2f4c6b" exitCode=0 Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.455604 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerDied","Data":"c4f711db230ebc73693e2f6379e0f90fe08e7aafcdf39ea36f909c060c2f4c6b"} Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.465424 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"041c17c0-b4dd-46a2-ba18-2f972ad0ff6a","Type":"ContainerDied","Data":"c1044144a0c40a324651100bfb962ebfd5b2fa6eacefe7c64c56ac9562bab6c9"} Nov 24 13:30:52 crc kubenswrapper[4752]: 
I1124 13:30:52.465503 4752 scope.go:117] "RemoveContainer" containerID="e990c3beac152b75ded134cdd0efd6eeba5575ee5897b402d556662bee762d5f" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.465642 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.510020 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.526148 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.544728 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.549214 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.551343 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.562552 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.565851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.565933 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.565974 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mqv\" (UniqueName: \"kubernetes.io/projected/a40b5c05-d6a1-457c-bba1-8496e195c2a4-kube-api-access-b9mqv\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.637665 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.668111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.668372 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.668488 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b9mqv\" (UniqueName: \"kubernetes.io/projected/a40b5c05-d6a1-457c-bba1-8496e195c2a4-kube-api-access-b9mqv\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.684888 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.685391 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40b5c05-d6a1-457c-bba1-8496e195c2a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.696654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mqv\" (UniqueName: \"kubernetes.io/projected/a40b5c05-d6a1-457c-bba1-8496e195c2a4-kube-api-access-b9mqv\") pod \"nova-cell0-conductor-0\" (UID: \"a40b5c05-d6a1-457c-bba1-8496e195c2a4\") " pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.759575 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="041c17c0-b4dd-46a2-ba18-2f972ad0ff6a" path="/var/lib/kubelet/pods/041c17c0-b4dd-46a2-ba18-2f972ad0ff6a/volumes" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.761571 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7b2d8a-5981-49b8-bebe-43341522af04" path="/var/lib/kubelet/pods/0d7b2d8a-5981-49b8-bebe-43341522af04/volumes" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.762269 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14b0aa7-9a78-47bc-b0ae-e78d717028fc" path="/var/lib/kubelet/pods/d14b0aa7-9a78-47bc-b0ae-e78d717028fc/volumes" Nov 24 13:30:52 crc kubenswrapper[4752]: W1124 13:30:52.813282 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c088e9_08c6_4695_8c18_c345a40d1eb1.slice/crio-a9be196ae50ae4b0eff2924fa9d9025fb1294f0d4cad06a127b06b416d5498e0 WatchSource:0}: Error finding container a9be196ae50ae4b0eff2924fa9d9025fb1294f0d4cad06a127b06b416d5498e0: Status 404 returned error can't find the container with id a9be196ae50ae4b0eff2924fa9d9025fb1294f0d4cad06a127b06b416d5498e0 Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.815570 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.838771 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.857684 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 13:30:52 crc kubenswrapper[4752]: I1124 13:30:52.914868 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.021389 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle\") pod \"57480bf6-a5ed-4c0b-b787-fc2863744323\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.021837 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data\") pod \"57480bf6-a5ed-4c0b-b787-fc2863744323\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.021888 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle\") pod \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.022049 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqb4w\" (UniqueName: \"kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w\") pod \"57480bf6-a5ed-4c0b-b787-fc2863744323\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.022096 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs\") pod \"57480bf6-a5ed-4c0b-b787-fc2863744323\" (UID: \"57480bf6-a5ed-4c0b-b787-fc2863744323\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.022281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data\") pod \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.022398 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs\") pod \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.022437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mhx5\" (UniqueName: \"kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5\") pod \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\" (UID: \"fb42f12d-f3af-4fb3-9a41-e60dae4129e9\") " Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.024363 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs" (OuterVolumeSpecName: "logs") pod "57480bf6-a5ed-4c0b-b787-fc2863744323" (UID: "57480bf6-a5ed-4c0b-b787-fc2863744323"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.027889 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs" (OuterVolumeSpecName: "logs") pod "fb42f12d-f3af-4fb3-9a41-e60dae4129e9" (UID: "fb42f12d-f3af-4fb3-9a41-e60dae4129e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.028452 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w" (OuterVolumeSpecName: "kube-api-access-qqb4w") pod "57480bf6-a5ed-4c0b-b787-fc2863744323" (UID: "57480bf6-a5ed-4c0b-b787-fc2863744323"). InnerVolumeSpecName "kube-api-access-qqb4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.032379 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5" (OuterVolumeSpecName: "kube-api-access-4mhx5") pod "fb42f12d-f3af-4fb3-9a41-e60dae4129e9" (UID: "fb42f12d-f3af-4fb3-9a41-e60dae4129e9"). InnerVolumeSpecName "kube-api-access-4mhx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.133197 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqb4w\" (UniqueName: \"kubernetes.io/projected/57480bf6-a5ed-4c0b-b787-fc2863744323-kube-api-access-qqb4w\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.133232 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57480bf6-a5ed-4c0b-b787-fc2863744323-logs\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.133246 4752 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-logs\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.133256 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mhx5\" (UniqueName: \"kubernetes.io/projected/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-kube-api-access-4mhx5\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.142627 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57480bf6-a5ed-4c0b-b787-fc2863744323" (UID: "57480bf6-a5ed-4c0b-b787-fc2863744323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.146874 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data" (OuterVolumeSpecName: "config-data") pod "fb42f12d-f3af-4fb3-9a41-e60dae4129e9" (UID: "fb42f12d-f3af-4fb3-9a41-e60dae4129e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.154071 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb42f12d-f3af-4fb3-9a41-e60dae4129e9" (UID: "fb42f12d-f3af-4fb3-9a41-e60dae4129e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.184610 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data" (OuterVolumeSpecName: "config-data") pod "57480bf6-a5ed-4c0b-b787-fc2863744323" (UID: "57480bf6-a5ed-4c0b-b787-fc2863744323"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.235881 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.236178 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.236317 4752 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57480bf6-a5ed-4c0b-b787-fc2863744323-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.236428 4752 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb42f12d-f3af-4fb3-9a41-e60dae4129e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.475218 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d2c088e9-08c6-4695-8c18-c345a40d1eb1","Type":"ContainerStarted","Data":"e94f63bc420ebe618d70330da84a8d60c35bb56346bf474c20900cee5e513ef0"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.475307 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d2c088e9-08c6-4695-8c18-c345a40d1eb1","Type":"ContainerStarted","Data":"a9be196ae50ae4b0eff2924fa9d9025fb1294f0d4cad06a127b06b416d5498e0"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.475373 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.477006 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"57480bf6-a5ed-4c0b-b787-fc2863744323","Type":"ContainerDied","Data":"ec9161adbb3f6f2858f6501a2517eba020cba302b7a1cb700122281b3fa877d9"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.477058 4752 scope.go:117] "RemoveContainer" containerID="1b657b262b7c927418844cefa14a24a1f08b8545a12dc6c0372ba15a05df8e69" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.477252 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.491720 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb42f12d-f3af-4fb3-9a41-e60dae4129e9","Type":"ContainerDied","Data":"acffa6ece8773a2be2418b546c4583b183f6d3675783f85a1309276db36ace32"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.491901 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.496558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9d8eb2b-bb50-40ba-89d0-38898e19bf14","Type":"ContainerStarted","Data":"211b12588db8bfc5b8133682c7060ae3da21fba02dc180ad12d3d6d67bdc6bbe"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.496606 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9d8eb2b-bb50-40ba-89d0-38898e19bf14","Type":"ContainerStarted","Data":"f445f16cc4ae8c1894bd4475a0f0514546a530f2ec2848d6fb093218c299136b"} Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.659795 4752 scope.go:117] "RemoveContainer" containerID="39289c74d13bc13a138076c50a5b483d89a5ef2e1c92a883045bd725d62f81d5" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.693533 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.712168 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.712139525 podStartE2EDuration="2.712139525s" podCreationTimestamp="2025-11-24 13:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:30:53.658768975 +0000 UTC m=+8659.643589274" watchObservedRunningTime="2025-11-24 13:30:53.712139525 +0000 UTC m=+8659.696959824" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.739662 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.739634083 podStartE2EDuration="2.739634083s" podCreationTimestamp="2025-11-24 13:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:30:53.683658199 +0000 UTC m=+8659.668478488" watchObservedRunningTime="2025-11-24 13:30:53.739634083 +0000 UTC m=+8659.724454412" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.799375 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.814558 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.823724 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833046 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: E1124 13:30:53.833576 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-log" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833597 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" 
containerName="nova-api-log" Nov 24 13:30:53 crc kubenswrapper[4752]: E1124 13:30:53.833619 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833626 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" Nov 24 13:30:53 crc kubenswrapper[4752]: E1124 13:30:53.833633 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833639 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" Nov 24 13:30:53 crc kubenswrapper[4752]: E1124 13:30:53.833671 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833682 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833943 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-log" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833970 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-log" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833985 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" containerName="nova-api-api" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.833995 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" containerName="nova-metadata-metadata" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.838391 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.842707 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.848495 4752 scope.go:117] "RemoveContainer" containerID="c4f711db230ebc73693e2f6379e0f90fe08e7aafcdf39ea36f909c060c2f4c6b" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.845925 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.858791 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.866811 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.868878 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.873355 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.881510 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.913354 4752 scope.go:117] "RemoveContainer" containerID="a56df5bb968f85b9c4b51ed2a7e57a2c030658b57c502d986589c01c7a4913f5" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962614 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962682 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rd8\" (UniqueName: \"kubernetes.io/projected/74f88a59-9db6-49da-9d6f-b564689e662f-kube-api-access-95rd8\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962797 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lpz4\" (UniqueName: \"kubernetes.io/projected/eb38b860-ee90-4ebe-bf0d-02285792bbc9-kube-api-access-2lpz4\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962878 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f88a59-9db6-49da-9d6f-b564689e662f-logs\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962904 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-config-data\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb38b860-ee90-4ebe-bf0d-02285792bbc9-logs\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:53 crc kubenswrapper[4752]: I1124 13:30:53.962992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-config-data\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 
crc kubenswrapper[4752]: I1124 13:30:54.064242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-config-data\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.064536 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb38b860-ee90-4ebe-bf0d-02285792bbc9-logs\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.064679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-config-data\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.064786 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.064866 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.064948 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rd8\" (UniqueName: \"kubernetes.io/projected/74f88a59-9db6-49da-9d6f-b564689e662f-kube-api-access-95rd8\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.065043 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lpz4\" (UniqueName: \"kubernetes.io/projected/eb38b860-ee90-4ebe-bf0d-02285792bbc9-kube-api-access-2lpz4\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.066344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f88a59-9db6-49da-9d6f-b564689e662f-logs\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.065057 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb38b860-ee90-4ebe-bf0d-02285792bbc9-logs\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.066786 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74f88a59-9db6-49da-9d6f-b564689e662f-logs\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.068702 
4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.068718 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.071198 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb38b860-ee90-4ebe-bf0d-02285792bbc9-config-data\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.071854 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f88a59-9db6-49da-9d6f-b564689e662f-config-data\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.083632 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lpz4\" (UniqueName: \"kubernetes.io/projected/eb38b860-ee90-4ebe-bf0d-02285792bbc9-kube-api-access-2lpz4\") pod \"nova-api-0\" (UID: \"eb38b860-ee90-4ebe-bf0d-02285792bbc9\") " pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.086793 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rd8\" (UniqueName: \"kubernetes.io/projected/74f88a59-9db6-49da-9d6f-b564689e662f-kube-api-access-95rd8\") pod \"nova-metadata-0\" (UID: \"74f88a59-9db6-49da-9d6f-b564689e662f\") " pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.219221 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.230107 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.524661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a40b5c05-d6a1-457c-bba1-8496e195c2a4","Type":"ContainerStarted","Data":"54a18734b191840632375930e55e1f7c4d6884576fde20de18011505c5b43bb0"} Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.525030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a40b5c05-d6a1-457c-bba1-8496e195c2a4","Type":"ContainerStarted","Data":"5f7d9a35f98f898a51e81c98d8e4720ab4f9aecfa93b6bd54ecc98d378262e54"} Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.552194 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.552171488 podStartE2EDuration="2.552171488s" podCreationTimestamp="2025-11-24 13:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:30:54.545577989 +0000 UTC m=+8660.530398288" watchObservedRunningTime="2025-11-24 13:30:54.552171488 +0000 UTC m=+8660.536991787" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.747881 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57480bf6-a5ed-4c0b-b787-fc2863744323" path="/var/lib/kubelet/pods/57480bf6-a5ed-4c0b-b787-fc2863744323/volumes" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.748964 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb42f12d-f3af-4fb3-9a41-e60dae4129e9" path="/var/lib/kubelet/pods/fb42f12d-f3af-4fb3-9a41-e60dae4129e9/volumes" Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.852715 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 13:30:54 crc kubenswrapper[4752]: W1124 13:30:54.963086 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb38b860_ee90_4ebe_bf0d_02285792bbc9.slice/crio-bdf87593ab2171bb7000ecb392e42da1d693c3ed4a45ba9c2fe986f1df068321 WatchSource:0}: Error finding container bdf87593ab2171bb7000ecb392e42da1d693c3ed4a45ba9c2fe986f1df068321: Status 404 returned error can't find the container with id bdf87593ab2171bb7000ecb392e42da1d693c3ed4a45ba9c2fe986f1df068321 Nov 24 13:30:54 crc kubenswrapper[4752]: I1124 13:30:54.965237 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.541868 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb38b860-ee90-4ebe-bf0d-02285792bbc9","Type":"ContainerStarted","Data":"a9911a28b88215c07234214b979faa483c34411fcfbf6869b506fefdbf19ed5b"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.542304 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb38b860-ee90-4ebe-bf0d-02285792bbc9","Type":"ContainerStarted","Data":"3d0a905158456728df61b866ae776a52c7bce5a7535e116df1869cc5a38faac9"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.542317 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb38b860-ee90-4ebe-bf0d-02285792bbc9","Type":"ContainerStarted","Data":"bdf87593ab2171bb7000ecb392e42da1d693c3ed4a45ba9c2fe986f1df068321"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.546874 4752 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74f88a59-9db6-49da-9d6f-b564689e662f","Type":"ContainerStarted","Data":"587ab8e4e03d705bce4a313fccc11c6d4f6b7eed684a56e2f27c2163523d6ee2"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.546953 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74f88a59-9db6-49da-9d6f-b564689e662f","Type":"ContainerStarted","Data":"24456e3293ec5767af10a0fe291274ab08026d888173ce6221692d11f400a556"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.546970 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74f88a59-9db6-49da-9d6f-b564689e662f","Type":"ContainerStarted","Data":"5c99e6cfab3df056ac2af20c89b705e360768055588a17736f0645d720933356"} Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.547337 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 24 13:30:55 crc kubenswrapper[4752]: I1124 13:30:55.569537 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.569505363 podStartE2EDuration="2.569505363s" podCreationTimestamp="2025-11-24 13:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:30:55.568589637 +0000 UTC m=+8661.553409926" watchObservedRunningTime="2025-11-24 13:30:55.569505363 +0000 UTC m=+8661.554325652" Nov 24 13:30:56 crc kubenswrapper[4752]: I1124 13:30:56.583995 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.583975025 podStartE2EDuration="3.583975025s" podCreationTimestamp="2025-11-24 13:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 13:30:56.576597214 +0000 UTC m=+8662.561417503" watchObservedRunningTime="2025-11-24 13:30:56.583975025 +0000 UTC m=+8662.568795314" Nov 24 13:30:56 crc kubenswrapper[4752]: I1124 13:30:56.991576 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 13:30:59 crc kubenswrapper[4752]: I1124 13:30:59.221050 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 13:30:59 crc kubenswrapper[4752]: I1124 13:30:59.222063 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 13:31:01 crc kubenswrapper[4752]: I1124 13:31:01.991600 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 13:31:02 crc kubenswrapper[4752]: I1124 13:31:02.017428 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 13:31:02 crc kubenswrapper[4752]: I1124 13:31:02.040034 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 24 13:31:02 crc kubenswrapper[4752]: I1124 13:31:02.669318 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 13:31:02 crc kubenswrapper[4752]: I1124 13:31:02.954221 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 24 13:31:03 crc kubenswrapper[4752]: I1124 13:31:03.728143 4752 scope.go:117] 
"RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:31:03 crc kubenswrapper[4752]: E1124 13:31:03.728633 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:31:04 crc kubenswrapper[4752]: I1124 13:31:04.220590 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 13:31:04 crc kubenswrapper[4752]: I1124 13:31:04.220651 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 13:31:04 crc kubenswrapper[4752]: I1124 13:31:04.231246 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 13:31:04 crc kubenswrapper[4752]: I1124 13:31:04.231569 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 13:31:05 crc kubenswrapper[4752]: I1124 13:31:05.344026 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eb38b860-ee90-4ebe-bf0d-02285792bbc9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 13:31:05 crc kubenswrapper[4752]: I1124 13:31:05.385958 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eb38b860-ee90-4ebe-bf0d-02285792bbc9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 13:31:05 crc kubenswrapper[4752]: I1124 13:31:05.385992 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74f88a59-9db6-49da-9d6f-b564689e662f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 13:31:05 crc kubenswrapper[4752]: I1124 13:31:05.386049 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74f88a59-9db6-49da-9d6f-b564689e662f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.223367 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.224432 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.225639 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.235898 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.236788 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.237528 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.254091 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.757209 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.759198 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 13:31:14 crc kubenswrapper[4752]: I1124 13:31:14.761184 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.921624 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc"] Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.924297 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.926556 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.929833 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.930020 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.930199 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.930308 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qj7cx" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.930407 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.938051 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc"] Nov 24 13:31:15 crc kubenswrapper[4752]: I1124 13:31:15.939018 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073574 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073650 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: 
\"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073695 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073849 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5tz\" (UniqueName: \"kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.073948 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.074015 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.074163 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 
13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.074280 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.074321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.176809 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5tz\" (UniqueName: \"kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.176888 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.176971 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177074 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177244 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177295 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177344 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.177638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.178089 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.179015 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.184911 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.185352 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.185488 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.185759 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.187982 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.188454 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.189046 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.189794 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.201962 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5tz\" (UniqueName: \"kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.273183 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:31:16 crc kubenswrapper[4752]: I1124 13:31:16.953880 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc"] Nov 24 13:31:16 crc kubenswrapper[4752]: W1124 13:31:16.963430 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef01ca0_2da9_4115_8152_29ea3b6d7d3b.slice/crio-19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1 WatchSource:0}: Error finding container 19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1: Status 404 returned error can't find the container with id 19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1 Nov 24 13:31:17 crc kubenswrapper[4752]: I1124 13:31:17.812289 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" event={"ID":"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b","Type":"ContainerStarted","Data":"c2232481cf2bfc637ea37829e0b8edb50f60dfda3a18b0ca5e47e947dcf97ada"} Nov 24 13:31:17 crc kubenswrapper[4752]: I1124 13:31:17.812850 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" event={"ID":"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b","Type":"ContainerStarted","Data":"19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1"} Nov 24 13:31:17 crc kubenswrapper[4752]: I1124 13:31:17.838094 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" podStartSLOduration=2.375756804 podStartE2EDuration="2.838070768s" podCreationTimestamp="2025-11-24 13:31:15 +0000 UTC" firstStartedPulling="2025-11-24 13:31:16.966789371 +0000 UTC m=+8682.951609660" lastFinishedPulling="2025-11-24 13:31:17.429103295 +0000 UTC m=+8683.413923624" observedRunningTime="2025-11-24 13:31:17.828692769 +0000 UTC m=+8683.813513058" watchObservedRunningTime="2025-11-24 13:31:17.838070768 +0000 UTC m=+8683.822891057" Nov 24 13:31:18 crc kubenswrapper[4752]: I1124 13:31:18.728079 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:31:18 crc kubenswrapper[4752]: E1124 13:31:18.728650 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.647934 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.651693 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.660984 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.782275 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.783000 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.783034 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhz7\" (UniqueName: \"kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.885139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.885450 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.885497 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhz7\" (UniqueName: \"kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.885673 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.885871 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.904907 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhz7\" (UniqueName: \"kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7\") pod \"community-operators-h4rsv\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:31 crc kubenswrapper[4752]: I1124 13:31:31.980491 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.569796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.728177 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:31:32 crc kubenswrapper[4752]: E1124 13:31:32.730234 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.992366 4752 generic.go:334] "Generic (PLEG): container finished" podID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerID="b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a" exitCode=0 Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.992421 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerDied","Data":"b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a"} Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.992450 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerStarted","Data":"0bbbfe3ebf509022f5ba381e57e0adf9dbfe1cd6b9888f9e282c86e07306a7d3"} Nov 24 13:31:32 crc kubenswrapper[4752]: I1124 13:31:32.994604 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:31:34 crc kubenswrapper[4752]: I1124 13:31:34.004010 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerStarted","Data":"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e"} Nov 24 13:31:35 crc kubenswrapper[4752]: I1124 13:31:35.015360 4752 generic.go:334] "Generic (PLEG): container finished" podID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerID="9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e" exitCode=0 Nov 24 13:31:35 crc kubenswrapper[4752]: I1124 13:31:35.015466 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerDied","Data":"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e"} Nov 24 13:31:36 crc kubenswrapper[4752]: I1124 13:31:36.028577 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerStarted","Data":"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa"} Nov 24 13:31:36 crc kubenswrapper[4752]: I1124 13:31:36.053594 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h4rsv" podStartSLOduration=2.611098299 podStartE2EDuration="5.05357513s" podCreationTimestamp="2025-11-24 13:31:31 +0000 UTC" firstStartedPulling="2025-11-24 13:31:32.994250136 +0000 UTC m=+8698.979070445" lastFinishedPulling="2025-11-24 13:31:35.436726977 +0000 UTC m=+8701.421547276" observedRunningTime="2025-11-24 13:31:36.050621505 +0000 UTC m=+8702.035441794" watchObservedRunningTime="2025-11-24 13:31:36.05357513 +0000 UTC m=+8702.038395419" Nov 24 13:31:39 crc kubenswrapper[4752]: E1124 13:31:39.722657 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:31:41 crc kubenswrapper[4752]: I1124 13:31:41.981175 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:41 crc kubenswrapper[4752]: I1124 13:31:41.981715 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:42 crc kubenswrapper[4752]: I1124 13:31:42.028450 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:42 crc kubenswrapper[4752]: I1124 13:31:42.145535 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:43 crc kubenswrapper[4752]: I1124 13:31:43.079941 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.122818 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h4rsv" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="registry-server" containerID="cri-o://93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa" gracePeriod=2 Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.611331 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.720011 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content\") pod \"aee7e4ae-7900-4552-97b0-4466dfba2f60\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.720099 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhz7\" (UniqueName: \"kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7\") pod \"aee7e4ae-7900-4552-97b0-4466dfba2f60\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.720231 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities\") pod \"aee7e4ae-7900-4552-97b0-4466dfba2f60\" (UID: \"aee7e4ae-7900-4552-97b0-4466dfba2f60\") " Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.721344 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities" (OuterVolumeSpecName: "utilities") pod "aee7e4ae-7900-4552-97b0-4466dfba2f60" (UID: "aee7e4ae-7900-4552-97b0-4466dfba2f60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.726046 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7" (OuterVolumeSpecName: "kube-api-access-kbhz7") pod "aee7e4ae-7900-4552-97b0-4466dfba2f60" (UID: "aee7e4ae-7900-4552-97b0-4466dfba2f60"). InnerVolumeSpecName "kube-api-access-kbhz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.778828 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aee7e4ae-7900-4552-97b0-4466dfba2f60" (UID: "aee7e4ae-7900-4552-97b0-4466dfba2f60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.823556 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.823601 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhz7\" (UniqueName: \"kubernetes.io/projected/aee7e4ae-7900-4552-97b0-4466dfba2f60-kube-api-access-kbhz7\") on node \"crc\" DevicePath \"\"" Nov 24 13:31:44 crc kubenswrapper[4752]: I1124 13:31:44.823631 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee7e4ae-7900-4552-97b0-4466dfba2f60-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.136279 4752 generic.go:334] "Generic (PLEG): container finished" podID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerID="93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa" exitCode=0 Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.136372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerDied","Data":"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa"} Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.136724 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h4rsv" event={"ID":"aee7e4ae-7900-4552-97b0-4466dfba2f60","Type":"ContainerDied","Data":"0bbbfe3ebf509022f5ba381e57e0adf9dbfe1cd6b9888f9e282c86e07306a7d3"} Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.136786 4752 scope.go:117] "RemoveContainer" containerID="93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.136430 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h4rsv" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.176875 4752 scope.go:117] "RemoveContainer" containerID="9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.179083 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.192559 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h4rsv"] Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.198534 4752 scope.go:117] "RemoveContainer" containerID="b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.264777 4752 scope.go:117] "RemoveContainer" containerID="93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa" Nov 24 13:31:45 crc kubenswrapper[4752]: E1124 13:31:45.265166 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa\": container with ID starting with 93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa not found: ID does not exist" containerID="93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.265220 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa"} err="failed to get container status \"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa\": rpc error: code = NotFound desc = could not find container \"93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa\": container with ID starting with 93dfcdbb7df5b99dc21f6dae30ee647f02f60282804581984fd6ed6e2a58abfa not found: ID does not exist" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.265249 4752 scope.go:117] "RemoveContainer" containerID="9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e" Nov 24 13:31:45 crc kubenswrapper[4752]: E1124 13:31:45.265544 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e\": container with ID starting with 9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e not found: ID does not exist" containerID="9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.265588 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e"} err="failed to get container status \"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e\": rpc error: code = NotFound desc = could not find container \"9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e\": container with ID starting with 9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e not found: ID does not exist" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.265620 4752 scope.go:117] "RemoveContainer" containerID="b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a" Nov 24 13:31:45 crc kubenswrapper[4752]: E1124 13:31:45.266887 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a\": container with ID starting with b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a not found: ID does not exist" containerID="b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.266924 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a"} err="failed to get container status \"b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a\": rpc error: code = NotFound desc = could not find container \"b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a\": container with ID starting with b9c712cb4c4eedc481e6c5e77ab3ab45f96a68ca1e4e82772e34f7f87f81fc1a not found: ID does not exist" Nov 24 13:31:45 crc kubenswrapper[4752]: I1124 13:31:45.728429 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:31:45 crc kubenswrapper[4752]: E1124 13:31:45.728778 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:31:46 crc kubenswrapper[4752]: I1124 13:31:46.740994 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" path="/var/lib/kubelet/pods/aee7e4ae-7900-4552-97b0-4466dfba2f60/volumes" Nov 24 13:31:50 crc kubenswrapper[4752]: E1124 13:31:50.023638 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:31:56 crc kubenswrapper[4752]: I1124 13:31:56.728719 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:31:56 crc kubenswrapper[4752]: E1124 13:31:56.729651 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:32:00 crc kubenswrapper[4752]: E1124 13:32:00.307773 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:32:10 crc kubenswrapper[4752]: E1124 13:32:10.573021 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:32:11 crc kubenswrapper[4752]: I1124 13:32:11.728554 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:32:11 crc kubenswrapper[4752]: E1124 13:32:11.730274 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:32:20 crc kubenswrapper[4752]: E1124 13:32:20.901045 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:32:22 crc kubenswrapper[4752]: I1124 13:32:22.729504 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:32:23 crc kubenswrapper[4752]: I1124 13:32:23.534114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff"} Nov 24 13:32:31 crc kubenswrapper[4752]: E1124 13:32:31.252249 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee7e4ae_7900_4552_97b0_4466dfba2f60.slice/crio-conmon-9b63628d0dbb00a082012841d4fb00969de4b8cf24f369357828149a61e4408e.scope\": RecentStats: unable to find data in memory cache]" Nov 24 13:34:45 crc kubenswrapper[4752]: I1124 13:34:45.469525 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:34:45 crc kubenswrapper[4752]: I1124 13:34:45.470401 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:35:15 crc kubenswrapper[4752]: I1124 13:35:15.468436 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:35:15 crc kubenswrapper[4752]: I1124 13:35:15.469097 4752 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.308041 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:33 crc kubenswrapper[4752]: E1124 13:35:33.309321 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="registry-server" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.309442 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="registry-server" Nov 24 13:35:33 crc kubenswrapper[4752]: E1124 13:35:33.309463 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="extract-utilities" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.309473 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="extract-utilities" Nov 24 13:35:33 crc kubenswrapper[4752]: E1124 13:35:33.309509 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="extract-content" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.309519 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="extract-content" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.309804 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee7e4ae-7900-4552-97b0-4466dfba2f60" containerName="registry-server" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.312284 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.324540 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.367710 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q59ns\" (UniqueName: \"kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.367831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.367870 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.469507 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q59ns\" (UniqueName: \"kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.469551 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.469572 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.470219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.470219 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.491665 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q59ns\" (UniqueName: \"kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns\") pod \"certified-operators-wq8b7\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:33 crc kubenswrapper[4752]: I1124 13:35:33.644958 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:34 crc kubenswrapper[4752]: I1124 13:35:34.191599 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:34 crc kubenswrapper[4752]: I1124 13:35:34.833131 4752 generic.go:334] "Generic (PLEG): container finished" podID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerID="2695e8dd0f2c5e53f59dcea1cfb7f3c919ceededf93d64621d83be4b0a42512b" exitCode=0 Nov 24 13:35:34 crc kubenswrapper[4752]: I1124 13:35:34.833192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerDied","Data":"2695e8dd0f2c5e53f59dcea1cfb7f3c919ceededf93d64621d83be4b0a42512b"} Nov 24 13:35:34 crc kubenswrapper[4752]: I1124 13:35:34.833452 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerStarted","Data":"5d31d78cb57175131c1422a0e85b16b4b42681e678eed82477ced369e57d72c9"} Nov 24 13:35:35 crc kubenswrapper[4752]: I1124 13:35:35.851990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerStarted","Data":"d5632bfb0c288228e9b44f2d3e91dbe4699a59ad49f353d617a862766d847a5e"} Nov 24 13:35:36 crc kubenswrapper[4752]: I1124 13:35:36.867263 4752 generic.go:334] "Generic (PLEG): container finished" podID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerID="d5632bfb0c288228e9b44f2d3e91dbe4699a59ad49f353d617a862766d847a5e" exitCode=0 Nov 24 13:35:36 crc kubenswrapper[4752]: I1124 13:35:36.867387 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerDied","Data":"d5632bfb0c288228e9b44f2d3e91dbe4699a59ad49f353d617a862766d847a5e"} Nov 24 13:35:37 crc kubenswrapper[4752]: I1124 13:35:37.880415 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerStarted","Data":"ecf2e42cb919d9c7bd8b814a890dc8ae57afd41e01b65bdc3e81b75088fec391"} Nov 24 13:35:37 crc kubenswrapper[4752]: I1124 13:35:37.908103 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wq8b7" podStartSLOduration=2.299433475 podStartE2EDuration="4.90808531s" podCreationTimestamp="2025-11-24 13:35:33 +0000 UTC" firstStartedPulling="2025-11-24 13:35:34.836314619 +0000 UTC m=+8940.821134898" lastFinishedPulling="2025-11-24 13:35:37.444966444 +0000 UTC m=+8943.429786733" observedRunningTime="2025-11-24 13:35:37.9028411 +0000 UTC m=+8943.887661399" watchObservedRunningTime="2025-11-24 13:35:37.90808531 +0000 UTC m=+8943.892905599" Nov 24 13:35:43 crc kubenswrapper[4752]: I1124 13:35:43.645915 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:43 crc kubenswrapper[4752]: I1124 13:35:43.646791 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:43 crc kubenswrapper[4752]: I1124 13:35:43.704275 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:43 crc kubenswrapper[4752]: I1124 13:35:43.994616 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:44 crc kubenswrapper[4752]: I1124 13:35:44.045112 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.469544 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.470166 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.470235 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.471467 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.471525 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff" gracePeriod=600 Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.983192 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff" exitCode=0 Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.983955 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wq8b7" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="registry-server" containerID="cri-o://ecf2e42cb919d9c7bd8b814a890dc8ae57afd41e01b65bdc3e81b75088fec391" gracePeriod=2 Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.983396 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" 
event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff"} Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.984048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29"} Nov 24 13:35:45 crc kubenswrapper[4752]: I1124 13:35:45.984074 4752 scope.go:117] "RemoveContainer" containerID="1a089388d4df9ad719898856284c6337e3ff4bcf9f29f31651cff6612d3497f5" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.001119 4752 generic.go:334] "Generic (PLEG): container finished" podID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerID="ecf2e42cb919d9c7bd8b814a890dc8ae57afd41e01b65bdc3e81b75088fec391" exitCode=0 Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.001185 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerDied","Data":"ecf2e42cb919d9c7bd8b814a890dc8ae57afd41e01b65bdc3e81b75088fec391"} Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.284225 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.319704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities\") pod \"612c5cd2-46fd-4dea-8159-c2d465f05e65\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.319870 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q59ns\" (UniqueName: \"kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns\") pod \"612c5cd2-46fd-4dea-8159-c2d465f05e65\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.320152 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content\") pod \"612c5cd2-46fd-4dea-8159-c2d465f05e65\" (UID: \"612c5cd2-46fd-4dea-8159-c2d465f05e65\") " Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.320914 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities" (OuterVolumeSpecName: "utilities") pod "612c5cd2-46fd-4dea-8159-c2d465f05e65" (UID: "612c5cd2-46fd-4dea-8159-c2d465f05e65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.330255 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns" (OuterVolumeSpecName: "kube-api-access-q59ns") pod "612c5cd2-46fd-4dea-8159-c2d465f05e65" (UID: "612c5cd2-46fd-4dea-8159-c2d465f05e65"). InnerVolumeSpecName "kube-api-access-q59ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.380060 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "612c5cd2-46fd-4dea-8159-c2d465f05e65" (UID: "612c5cd2-46fd-4dea-8159-c2d465f05e65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.424160 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.424238 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q59ns\" (UniqueName: \"kubernetes.io/projected/612c5cd2-46fd-4dea-8159-c2d465f05e65-kube-api-access-q59ns\") on node \"crc\" DevicePath \"\"" Nov 24 13:35:47 crc kubenswrapper[4752]: I1124 13:35:47.424253 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612c5cd2-46fd-4dea-8159-c2d465f05e65-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.014042 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq8b7" event={"ID":"612c5cd2-46fd-4dea-8159-c2d465f05e65","Type":"ContainerDied","Data":"5d31d78cb57175131c1422a0e85b16b4b42681e678eed82477ced369e57d72c9"} Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.014417 4752 scope.go:117] "RemoveContainer" containerID="ecf2e42cb919d9c7bd8b814a890dc8ae57afd41e01b65bdc3e81b75088fec391" Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.014211 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq8b7" Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.044938 4752 scope.go:117] "RemoveContainer" containerID="d5632bfb0c288228e9b44f2d3e91dbe4699a59ad49f353d617a862766d847a5e" Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.078680 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.086700 4752 scope.go:117] "RemoveContainer" containerID="2695e8dd0f2c5e53f59dcea1cfb7f3c919ceededf93d64621d83be4b0a42512b" Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.095415 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wq8b7"] Nov 24 13:35:48 crc kubenswrapper[4752]: I1124 13:35:48.745487 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" path="/var/lib/kubelet/pods/612c5cd2-46fd-4dea-8159-c2d465f05e65/volumes" Nov 24 13:36:42 crc kubenswrapper[4752]: I1124 13:36:42.622574 4752 generic.go:334] "Generic (PLEG): container finished" podID="6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" containerID="c2232481cf2bfc637ea37829e0b8edb50f60dfda3a18b0ca5e47e947dcf97ada" exitCode=0 Nov 24 13:36:42 crc kubenswrapper[4752]: I1124 13:36:42.622715 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" event={"ID":"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b","Type":"ContainerDied","Data":"c2232481cf2bfc637ea37829e0b8edb50f60dfda3a18b0ca5e47e947dcf97ada"} Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.128463 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236105 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg5tz\" (UniqueName: \"kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236150 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236169 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236335 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236444 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236467 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236491 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236567 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.236607 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph\") pod \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\" (UID: \"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b\") " Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.248927 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph" (OuterVolumeSpecName: "ceph") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.249067 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz" (OuterVolumeSpecName: "kube-api-access-kg5tz") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "kube-api-access-kg5tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.252588 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.278012 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory" (OuterVolumeSpecName: "inventory") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.279088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.284939 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.285633 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.286537 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.286838 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.289466 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.293311 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" (UID: "6ef01ca0-2da9-4115-8152-29ea3b6d7d3b"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339317 4752 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339348 4752 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339361 4752 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339370 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339379 4752 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-ceph\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339388 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339398 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg5tz\" (UniqueName: \"kubernetes.io/projected/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-kube-api-access-kg5tz\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339412 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339431 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc 
kubenswrapper[4752]: I1124 13:36:44.339445 4752 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.339457 4752 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ef01ca0-2da9-4115-8152-29ea3b6d7d3b-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.654673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" event={"ID":"6ef01ca0-2da9-4115-8152-29ea3b6d7d3b","Type":"ContainerDied","Data":"19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1"} Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.655149 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b68188e824a15cf293ecaa7960ec164bbcd51f0c410bc28efa1708ed1ea0c1" Nov 24 13:36:44 crc kubenswrapper[4752]: I1124 13:36:44.654841 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc" Nov 24 13:37:45 crc kubenswrapper[4752]: I1124 13:37:45.469010 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:37:45 crc kubenswrapper[4752]: I1124 13:37:45.469642 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:38:15 crc kubenswrapper[4752]: I1124 13:38:15.469666 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:38:15 crc kubenswrapper[4752]: I1124 13:38:15.470233 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:38:45 crc kubenswrapper[4752]: I1124 13:38:45.468683 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:38:45 crc kubenswrapper[4752]: I1124 13:38:45.469433 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 24 13:38:45 crc kubenswrapper[4752]: I1124 13:38:45.469498 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:38:45 crc kubenswrapper[4752]: I1124 13:38:45.470641 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:38:45 crc kubenswrapper[4752]: I1124 13:38:45.470729 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" gracePeriod=600 Nov 24 13:38:45 crc kubenswrapper[4752]: E1124 13:38:45.606426 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:38:46 crc kubenswrapper[4752]: I1124 13:38:46.140915 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" exitCode=0 Nov 24 13:38:46 crc kubenswrapper[4752]: I1124 13:38:46.141176 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29"} Nov 24 13:38:46 crc kubenswrapper[4752]: I1124 13:38:46.141287 4752 scope.go:117] "RemoveContainer" containerID="156d1b69d212e15eb8f8c152839e02ce3fd14403dd58653932b393efcaee9fff" Nov 24 13:38:46 crc kubenswrapper[4752]: I1124 13:38:46.142414 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:38:46 crc kubenswrapper[4752]: E1124 13:38:46.143086 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.440818 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hbmt/must-gather-m7jq2"] Nov 24 13:38:57 crc kubenswrapper[4752]: E1124 13:38:57.441895 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.441913 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" 
containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 24 13:38:57 crc kubenswrapper[4752]: E1124 13:38:57.441943 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="extract-content" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.441955 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="extract-content" Nov 24 13:38:57 crc kubenswrapper[4752]: E1124 13:38:57.441980 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="extract-utilities" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.441987 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="extract-utilities" Nov 24 13:38:57 crc kubenswrapper[4752]: E1124 13:38:57.442005 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="registry-server" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.442012 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="registry-server" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.442255 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef01ca0-2da9-4115-8152-29ea3b6d7d3b" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.442298 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="612c5cd2-46fd-4dea-8159-c2d465f05e65" containerName="registry-server" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.443838 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.446507 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6hbmt"/"kube-root-ca.crt" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.446779 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6hbmt"/"openshift-service-ca.crt" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.471689 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hbmt/must-gather-m7jq2"] Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.541810 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnzs\" (UniqueName: \"kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.541873 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.644332 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnzs\" (UniqueName: \"kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.644380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.645021 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.666172 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnzs\" (UniqueName: \"kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs\") pod \"must-gather-m7jq2\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") " pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:57 crc kubenswrapper[4752]: I1124 13:38:57.770515 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" Nov 24 13:38:58 crc kubenswrapper[4752]: I1124 13:38:58.262520 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hbmt/must-gather-m7jq2"] Nov 24 13:38:58 crc kubenswrapper[4752]: I1124 13:38:58.281188 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:38:58 crc kubenswrapper[4752]: I1124 13:38:58.293393 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" event={"ID":"82a974f6-7004-41a1-ba5c-bb67b6d5b23d","Type":"ContainerStarted","Data":"b56df395afd6f157cfb64b4316d3a8da282ed11187f7d656cc30be9f0555e266"} Nov 24 13:38:58 crc kubenswrapper[4752]: I1124 13:38:58.727770 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:38:58 crc kubenswrapper[4752]: E1124 13:38:58.728137 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:39:04 crc kubenswrapper[4752]: I1124 13:39:04.403140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" event={"ID":"82a974f6-7004-41a1-ba5c-bb67b6d5b23d","Type":"ContainerStarted","Data":"467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a"} Nov 24 13:39:04 crc kubenswrapper[4752]: I1124 13:39:04.403820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" event={"ID":"82a974f6-7004-41a1-ba5c-bb67b6d5b23d","Type":"ContainerStarted","Data":"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"} Nov 24 13:39:04 crc kubenswrapper[4752]: I1124 13:39:04.428646 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" podStartSLOduration=2.294609473 podStartE2EDuration="7.428624064s" podCreationTimestamp="2025-11-24 13:38:57 +0000 UTC" firstStartedPulling="2025-11-24 13:38:58.281105888 +0000 UTC m=+9144.265926177" lastFinishedPulling="2025-11-24 13:39:03.415120479 +0000 UTC m=+9149.399940768" observedRunningTime="2025-11-24 13:39:04.427400219 +0000 UTC m=+9150.412220518" watchObservedRunningTime="2025-11-24 13:39:04.428624064 +0000 UTC m=+9150.413444363" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.267255 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-nrvp9"] Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.270158 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.272357 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6hbmt"/"default-dockercfg-q2mtp" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.385227 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.385326 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szpq4\" (UniqueName: \"kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.487689 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.487819 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szpq4\" (UniqueName: \"kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.487924 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.511694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szpq4\" (UniqueName: \"kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4\") pod \"crc-debug-nrvp9\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: I1124 13:39:08.588722 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:08 crc kubenswrapper[4752]: W1124 13:39:08.636928 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f48639_fd9e_46b3_b9a2_ebd50132a3cc.slice/crio-fa8713f1ec1e9a2dfa2fa68622959b57694eec6e89ee88481574e234464426d0 WatchSource:0}: Error finding container fa8713f1ec1e9a2dfa2fa68622959b57694eec6e89ee88481574e234464426d0: Status 404 returned error can't find the container with id fa8713f1ec1e9a2dfa2fa68622959b57694eec6e89ee88481574e234464426d0 Nov 24 13:39:09 crc kubenswrapper[4752]: I1124 13:39:09.457385 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" event={"ID":"66f48639-fd9e-46b3-b9a2-ebd50132a3cc","Type":"ContainerStarted","Data":"fa8713f1ec1e9a2dfa2fa68622959b57694eec6e89ee88481574e234464426d0"} Nov 24 13:39:13 crc kubenswrapper[4752]: I1124 13:39:13.728810 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:39:13 crc kubenswrapper[4752]: E1124 13:39:13.729952 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:39:21 crc kubenswrapper[4752]: I1124 13:39:21.614685 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" event={"ID":"66f48639-fd9e-46b3-b9a2-ebd50132a3cc","Type":"ContainerStarted","Data":"e22e3ed2779a6ca6f8e787401c7c4a76d1ea8652ee3b2566b49e54fca11be959"} Nov 24 13:39:21 crc kubenswrapper[4752]: I1124 13:39:21.629074 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" podStartSLOduration=1.603085975 podStartE2EDuration="13.629054636s" podCreationTimestamp="2025-11-24 13:39:08 +0000 UTC" firstStartedPulling="2025-11-24 13:39:08.639621325 +0000 UTC m=+9154.624441614" lastFinishedPulling="2025-11-24 13:39:20.665589986 +0000 UTC m=+9166.650410275" observedRunningTime="2025-11-24 13:39:21.626471782 +0000 UTC m=+9167.611292081" watchObservedRunningTime="2025-11-24 13:39:21.629054636 +0000 UTC m=+9167.613874935" Nov 24 13:39:27 crc kubenswrapper[4752]: I1124 13:39:27.728020 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:39:27 crc kubenswrapper[4752]: E1124 13:39:27.728721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:39:39 crc kubenswrapper[4752]: I1124 13:39:39.728942 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:39:39 crc kubenswrapper[4752]: E1124 13:39:39.730263 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:39:42 crc kubenswrapper[4752]: I1124 13:39:42.873715 4752 generic.go:334] "Generic (PLEG): container finished" podID="66f48639-fd9e-46b3-b9a2-ebd50132a3cc" containerID="e22e3ed2779a6ca6f8e787401c7c4a76d1ea8652ee3b2566b49e54fca11be959" exitCode=0 Nov 24 13:39:42 crc kubenswrapper[4752]: I1124 13:39:42.873837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" event={"ID":"66f48639-fd9e-46b3-b9a2-ebd50132a3cc","Type":"ContainerDied","Data":"e22e3ed2779a6ca6f8e787401c7c4a76d1ea8652ee3b2566b49e54fca11be959"} Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.012119 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.058227 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-nrvp9"] Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.072352 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-nrvp9"] Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.201931 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szpq4\" (UniqueName: \"kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4\") pod \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.202355 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host\") pod \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\" (UID: \"66f48639-fd9e-46b3-b9a2-ebd50132a3cc\") " Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.202454 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host" (OuterVolumeSpecName: "host") pod "66f48639-fd9e-46b3-b9a2-ebd50132a3cc" (UID: "66f48639-fd9e-46b3-b9a2-ebd50132a3cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.203040 4752 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-host\") on node \"crc\" DevicePath \"\"" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.207149 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4" (OuterVolumeSpecName: "kube-api-access-szpq4") pod "66f48639-fd9e-46b3-b9a2-ebd50132a3cc" (UID: "66f48639-fd9e-46b3-b9a2-ebd50132a3cc"). InnerVolumeSpecName "kube-api-access-szpq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.305010 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szpq4\" (UniqueName: \"kubernetes.io/projected/66f48639-fd9e-46b3-b9a2-ebd50132a3cc-kube-api-access-szpq4\") on node \"crc\" DevicePath \"\"" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.742253 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f48639-fd9e-46b3-b9a2-ebd50132a3cc" path="/var/lib/kubelet/pods/66f48639-fd9e-46b3-b9a2-ebd50132a3cc/volumes" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.896378 4752 scope.go:117] "RemoveContainer" containerID="e22e3ed2779a6ca6f8e787401c7c4a76d1ea8652ee3b2566b49e54fca11be959" Nov 24 13:39:44 crc kubenswrapper[4752]: I1124 13:39:44.896592 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-nrvp9" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.253260 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-mpj8c"] Nov 24 13:39:45 crc kubenswrapper[4752]: E1124 13:39:45.254630 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f48639-fd9e-46b3-b9a2-ebd50132a3cc" containerName="container-00" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.254647 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f48639-fd9e-46b3-b9a2-ebd50132a3cc" containerName="container-00" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.255030 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f48639-fd9e-46b3-b9a2-ebd50132a3cc" containerName="container-00" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.256087 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.258857 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6hbmt"/"default-dockercfg-q2mtp" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.430202 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.430379 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54lk\" (UniqueName: \"kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.532636 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.532756 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54lk\" (UniqueName: \"kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.532884 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.560694 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54lk\" (UniqueName: \"kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk\") pod \"crc-debug-mpj8c\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.575094 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:45 crc kubenswrapper[4752]: I1124 13:39:45.913427 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" event={"ID":"fe9ce043-d08f-40ca-8d0c-c0101155336f","Type":"ContainerStarted","Data":"1c6d41376bf194fe362f7befc9b615bc0eb04460702eabc99da95fbeb2283a86"} Nov 24 13:39:46 crc kubenswrapper[4752]: I1124 13:39:46.924375 4752 generic.go:334] "Generic (PLEG): container finished" podID="fe9ce043-d08f-40ca-8d0c-c0101155336f" containerID="4894ca11acd573f1a5ff2249cc3841baca4f1530ce9a0d8ff1fd6364c15b5ca5" exitCode=1 Nov 24 13:39:46 crc kubenswrapper[4752]: I1124 13:39:46.924483 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" event={"ID":"fe9ce043-d08f-40ca-8d0c-c0101155336f","Type":"ContainerDied","Data":"4894ca11acd573f1a5ff2249cc3841baca4f1530ce9a0d8ff1fd6364c15b5ca5"} Nov 24 13:39:46 crc kubenswrapper[4752]: I1124 13:39:46.970331 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-mpj8c"] Nov 24 13:39:46 crc kubenswrapper[4752]: I1124 13:39:46.981532 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6hbmt/crc-debug-mpj8c"] Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.069573 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.193523 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p54lk\" (UniqueName: \"kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk\") pod \"fe9ce043-d08f-40ca-8d0c-c0101155336f\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.193924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host\") pod \"fe9ce043-d08f-40ca-8d0c-c0101155336f\" (UID: \"fe9ce043-d08f-40ca-8d0c-c0101155336f\") " Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.193978 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host" (OuterVolumeSpecName: "host") pod "fe9ce043-d08f-40ca-8d0c-c0101155336f" (UID: "fe9ce043-d08f-40ca-8d0c-c0101155336f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.194595 4752 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe9ce043-d08f-40ca-8d0c-c0101155336f-host\") on node \"crc\" DevicePath \"\"" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.199192 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk" (OuterVolumeSpecName: "kube-api-access-p54lk") pod "fe9ce043-d08f-40ca-8d0c-c0101155336f" (UID: "fe9ce043-d08f-40ca-8d0c-c0101155336f"). InnerVolumeSpecName "kube-api-access-p54lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.297518 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p54lk\" (UniqueName: \"kubernetes.io/projected/fe9ce043-d08f-40ca-8d0c-c0101155336f-kube-api-access-p54lk\") on node \"crc\" DevicePath \"\"" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.740504 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9ce043-d08f-40ca-8d0c-c0101155336f" path="/var/lib/kubelet/pods/fe9ce043-d08f-40ca-8d0c-c0101155336f/volumes" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.813109 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:39:48 crc kubenswrapper[4752]: E1124 13:39:48.813676 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe9ce043-d08f-40ca-8d0c-c0101155336f" containerName="container-00" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.813696 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9ce043-d08f-40ca-8d0c-c0101155336f" containerName="container-00" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.813929 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe9ce043-d08f-40ca-8d0c-c0101155336f" containerName="container-00" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.815581 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.822756 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.909340 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.909511 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c548\" (UniqueName: \"kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.909567 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.948882 4752 scope.go:117] "RemoveContainer" containerID="4894ca11acd573f1a5ff2249cc3841baca4f1530ce9a0d8ff1fd6364c15b5ca5" Nov 24 13:39:48 crc kubenswrapper[4752]: I1124 13:39:48.949155 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6hbmt/crc-debug-mpj8c" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.012937 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.013086 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c548\" (UniqueName: \"kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.013136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.013478 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.013907 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.035288 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c548\" (UniqueName: \"kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548\") pod \"redhat-marketplace-hdfkl\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.174091 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:49 crc kubenswrapper[4752]: I1124 13:39:49.821858 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:39:49 crc kubenswrapper[4752]: W1124 13:39:49.985122 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac4a7fcc_df9c_4924_bfcc_777a84bc0c90.slice/crio-21a2e3952470800d1cfd76d588155650745cb2d11b2b16f7adeb9ffb110dd908 WatchSource:0}: Error finding container 21a2e3952470800d1cfd76d588155650745cb2d11b2b16f7adeb9ffb110dd908: Status 404 returned error can't find the container with id 21a2e3952470800d1cfd76d588155650745cb2d11b2b16f7adeb9ffb110dd908 Nov 24 13:39:51 crc kubenswrapper[4752]: I1124 13:39:51.025717 4752 generic.go:334] "Generic (PLEG): container finished" podID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerID="bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13" exitCode=0 Nov 24 13:39:51 crc kubenswrapper[4752]: I1124 13:39:51.025800 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerDied","Data":"bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13"} Nov 24 13:39:51 crc kubenswrapper[4752]: I1124 13:39:51.026210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerStarted","Data":"21a2e3952470800d1cfd76d588155650745cb2d11b2b16f7adeb9ffb110dd908"} Nov 24 13:39:53 crc kubenswrapper[4752]: I1124 13:39:53.051004 4752 generic.go:334] "Generic (PLEG): container finished" podID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerID="8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108" exitCode=0 Nov 24 13:39:53 crc kubenswrapper[4752]: I1124 13:39:53.051110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerDied","Data":"8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108"} Nov 24 13:39:54 crc kubenswrapper[4752]: I1124 13:39:54.064272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerStarted","Data":"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d"} Nov 24 13:39:54 crc kubenswrapper[4752]: I1124 13:39:54.093804 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hdfkl" podStartSLOduration=3.555323583 podStartE2EDuration="6.093776675s" podCreationTimestamp="2025-11-24 13:39:48 +0000 UTC" firstStartedPulling="2025-11-24 13:39:51.028272294 +0000 UTC m=+9197.013092583" lastFinishedPulling="2025-11-24 13:39:53.566725386 +0000 UTC m=+9199.551545675" observedRunningTime="2025-11-24 13:39:54.080796773 +0000 UTC m=+9200.065617072" watchObservedRunningTime="2025-11-24 13:39:54.093776675 +0000 UTC m=+9200.078596964" Nov 24 13:39:54 crc kubenswrapper[4752]: I1124 13:39:54.734344 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:39:54 crc kubenswrapper[4752]: E1124 13:39:54.735153 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:39:59 crc kubenswrapper[4752]: I1124 13:39:59.175400 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:59 crc kubenswrapper[4752]: I1124 13:39:59.176058 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:39:59 crc kubenswrapper[4752]: I1124 13:39:59.250018 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:40:00 crc kubenswrapper[4752]: I1124 13:40:00.208285 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:40:00 crc kubenswrapper[4752]: I1124 13:40:00.273401 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.172635 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hdfkl" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="registry-server" containerID="cri-o://c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d" gracePeriod=2 Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.739601 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.856251 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content\") pod \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.856338 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c548\" (UniqueName: \"kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548\") pod \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.856541 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities\") pod \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\" (UID: \"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90\") " Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.857580 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities" (OuterVolumeSpecName: "utilities") pod "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" (UID: "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.858053 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.874269 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" (UID: "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:40:02 crc kubenswrapper[4752]: I1124 13:40:02.959896 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.188392 4752 generic.go:334] "Generic (PLEG): container finished" podID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerID="c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d" exitCode=0 Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.188451 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerDied","Data":"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d"} Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.188492 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdfkl" event={"ID":"ac4a7fcc-df9c-4924-bfcc-777a84bc0c90","Type":"ContainerDied","Data":"21a2e3952470800d1cfd76d588155650745cb2d11b2b16f7adeb9ffb110dd908"} Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.188507 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdfkl" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.188515 4752 scope.go:117] "RemoveContainer" containerID="c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.222355 4752 scope.go:117] "RemoveContainer" containerID="8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.671627 4752 scope.go:117] "RemoveContainer" containerID="bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.671686 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548" (OuterVolumeSpecName: "kube-api-access-8c548") pod "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" (UID: "ac4a7fcc-df9c-4924-bfcc-777a84bc0c90"). InnerVolumeSpecName "kube-api-access-8c548". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.675161 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c548\" (UniqueName: \"kubernetes.io/projected/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90-kube-api-access-8c548\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.780638 4752 scope.go:117] "RemoveContainer" containerID="c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d" Nov 24 13:40:03 crc kubenswrapper[4752]: E1124 13:40:03.781088 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d\": container with ID starting with c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d not found: ID does not exist" containerID="c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.781120 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d"} err="failed to get container status \"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d\": rpc error: code = NotFound desc = could not find container \"c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d\": container with ID starting with c365e5c89f7838135950bfd57949d1886ac3ace4a019e62a6ec27ae1aec83a0d not found: ID does not exist" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.781141 4752 scope.go:117] "RemoveContainer" containerID="8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108" Nov 24 13:40:03 crc kubenswrapper[4752]: E1124 13:40:03.781412 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108\": container with ID starting with 8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108 not found: ID does not exist" containerID="8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.781432 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108"} err="failed to get container status \"8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108\": rpc error: code = NotFound desc = could not find container \"8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108\": container with ID starting with 8cb849ad0db0107bf547fd75e26a9e14c1ba841050d18ad3337f21ecfb3a4108 not found: ID does not exist" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.781445 4752 scope.go:117] "RemoveContainer" containerID="bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13" Nov 24 13:40:03 crc kubenswrapper[4752]: E1124 13:40:03.781611 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13\": container with ID starting with bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13 not found: ID does not exist" containerID="bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.781630 4752 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13"} err="failed to get container status \"bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13\": rpc error: code = NotFound desc = could not find container \"bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13\": container with ID starting with bb8c8b98c3610df6e589998e0f63b35ca15a6178ddda319e24cf434bb2898a13 not found: ID does not exist" Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.843523 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:40:03 crc kubenswrapper[4752]: I1124 13:40:03.853137 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdfkl"] Nov 24 13:40:04 crc kubenswrapper[4752]: I1124 13:40:04.748624 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" path="/var/lib/kubelet/pods/ac4a7fcc-df9c-4924-bfcc-777a84bc0c90/volumes" Nov 24 13:40:09 crc kubenswrapper[4752]: I1124 13:40:09.728159 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:40:09 crc kubenswrapper[4752]: E1124 13:40:09.730267 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:40:21 crc kubenswrapper[4752]: I1124 13:40:21.729202 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:40:21 crc kubenswrapper[4752]: E1124 13:40:21.732476 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.550172 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:35 crc kubenswrapper[4752]: E1124 13:40:35.551551 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="extract-content" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.551567 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="extract-content" Nov 24 13:40:35 crc kubenswrapper[4752]: E1124 13:40:35.551594 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="registry-server" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.551600 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="registry-server" Nov 24 13:40:35 crc kubenswrapper[4752]: E1124 13:40:35.551620 4752 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="extract-utilities" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.551626 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="extract-utilities" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.551855 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4a7fcc-df9c-4924-bfcc-777a84bc0c90" containerName="registry-server" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.553597 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.564693 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.629922 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.630011 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.630167 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qhg\" (UniqueName: \"kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.728081 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:40:35 crc kubenswrapper[4752]: E1124 13:40:35.728412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.731796 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.732336 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.732457 
4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.732738 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.732945 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qhg\" (UniqueName: \"kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.754224 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qhg\" (UniqueName: \"kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg\") pod \"redhat-operators-bm6cb\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:35 crc kubenswrapper[4752]: I1124 13:40:35.885053 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:36 crc kubenswrapper[4752]: I1124 13:40:36.462633 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:36 crc kubenswrapper[4752]: W1124 13:40:36.470204 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804a9776_b4f3_4eda_bf42_aafc1b562519.slice/crio-a298bea8e09bbb3f5a309f299a98f1af869e46830b5abe7122fd9cfa218d66f4 WatchSource:0}: Error finding container a298bea8e09bbb3f5a309f299a98f1af869e46830b5abe7122fd9cfa218d66f4: Status 404 returned error can't find the container with id a298bea8e09bbb3f5a309f299a98f1af869e46830b5abe7122fd9cfa218d66f4 Nov 24 13:40:36 crc kubenswrapper[4752]: I1124 13:40:36.559603 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerStarted","Data":"a298bea8e09bbb3f5a309f299a98f1af869e46830b5abe7122fd9cfa218d66f4"} Nov 24 13:40:37 crc kubenswrapper[4752]: I1124 13:40:37.570876 4752 generic.go:334] "Generic (PLEG): container finished" podID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerID="389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8" exitCode=0 Nov 24 13:40:37 crc kubenswrapper[4752]: I1124 13:40:37.570970 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerDied","Data":"389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8"} Nov 24 13:40:39 crc kubenswrapper[4752]: I1124 13:40:39.593382 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" 
event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerStarted","Data":"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c"} Nov 24 13:40:43 crc kubenswrapper[4752]: I1124 13:40:43.641725 4752 generic.go:334] "Generic (PLEG): container finished" podID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerID="e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c" exitCode=0 Nov 24 13:40:43 crc kubenswrapper[4752]: I1124 13:40:43.641783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerDied","Data":"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c"} Nov 24 13:40:45 crc kubenswrapper[4752]: I1124 13:40:45.663333 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerStarted","Data":"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d"} Nov 24 13:40:45 crc kubenswrapper[4752]: I1124 13:40:45.688053 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bm6cb" podStartSLOduration=3.146790336 podStartE2EDuration="10.68803537s" podCreationTimestamp="2025-11-24 13:40:35 +0000 UTC" firstStartedPulling="2025-11-24 13:40:37.574377616 +0000 UTC m=+9243.559197905" lastFinishedPulling="2025-11-24 13:40:45.11562264 +0000 UTC m=+9251.100442939" observedRunningTime="2025-11-24 13:40:45.682941584 +0000 UTC m=+9251.667761873" watchObservedRunningTime="2025-11-24 13:40:45.68803537 +0000 UTC m=+9251.672855659" Nov 24 13:40:45 crc kubenswrapper[4752]: I1124 13:40:45.886009 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:45 crc kubenswrapper[4752]: I1124 13:40:45.886137 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:46 crc kubenswrapper[4752]: I1124 13:40:46.964641 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bm6cb" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server" probeResult="failure" output=< Nov 24 13:40:46 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Nov 24 13:40:46 crc kubenswrapper[4752]: > Nov 24 13:40:47 crc kubenswrapper[4752]: I1124 13:40:47.729012 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:40:47 crc kubenswrapper[4752]: E1124 13:40:47.729728 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:40:55 crc kubenswrapper[4752]: I1124 13:40:55.952665 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:56 crc kubenswrapper[4752]: I1124 13:40:56.010502 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:56 crc kubenswrapper[4752]: 
I1124 13:40:56.194507 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.151488 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bm6cb" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server" containerID="cri-o://43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d" gracePeriod=2 Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.758479 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.878310 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities\") pod \"804a9776-b4f3-4eda-bf42-aafc1b562519\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.878444 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qhg\" (UniqueName: \"kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg\") pod \"804a9776-b4f3-4eda-bf42-aafc1b562519\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.878519 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content\") pod \"804a9776-b4f3-4eda-bf42-aafc1b562519\" (UID: \"804a9776-b4f3-4eda-bf42-aafc1b562519\") " Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.879396 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities" (OuterVolumeSpecName: "utilities") pod "804a9776-b4f3-4eda-bf42-aafc1b562519" (UID: "804a9776-b4f3-4eda-bf42-aafc1b562519"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.884906 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.886936 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg" (OuterVolumeSpecName: "kube-api-access-v9qhg") pod "804a9776-b4f3-4eda-bf42-aafc1b562519" (UID: "804a9776-b4f3-4eda-bf42-aafc1b562519"). InnerVolumeSpecName "kube-api-access-v9qhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.973004 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "804a9776-b4f3-4eda-bf42-aafc1b562519" (UID: "804a9776-b4f3-4eda-bf42-aafc1b562519"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.987209 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qhg\" (UniqueName: \"kubernetes.io/projected/804a9776-b4f3-4eda-bf42-aafc1b562519-kube-api-access-v9qhg\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:57 crc kubenswrapper[4752]: I1124 13:40:57.987244 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804a9776-b4f3-4eda-bf42-aafc1b562519-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.163108 4752 generic.go:334] "Generic (PLEG): container finished" podID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerID="43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d" exitCode=0 Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.163157 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerDied","Data":"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d"} Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.163188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm6cb" event={"ID":"804a9776-b4f3-4eda-bf42-aafc1b562519","Type":"ContainerDied","Data":"a298bea8e09bbb3f5a309f299a98f1af869e46830b5abe7122fd9cfa218d66f4"} Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.163197 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm6cb" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.163209 4752 scope.go:117] "RemoveContainer" containerID="43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.192704 4752 scope.go:117] "RemoveContainer" containerID="e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.221452 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.225033 4752 scope.go:117] "RemoveContainer" containerID="389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.234184 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bm6cb"] Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.278837 4752 scope.go:117] "RemoveContainer" containerID="43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d" Nov 24 13:40:58 crc kubenswrapper[4752]: E1124 13:40:58.279266 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d\": container with ID starting with 43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d not found: ID does not exist" containerID="43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.279304 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d"} err="failed to get container status \"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d\": 
rpc error: code = NotFound desc = could not find container \"43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d\": container with ID starting with 43ef78655c9aacdc8abcaef2d488637d45fed6a3a79e1ba6cb04e7dc36bdf13d not found: ID does not exist" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.279329 4752 scope.go:117] "RemoveContainer" containerID="e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c" Nov 24 13:40:58 crc kubenswrapper[4752]: E1124 13:40:58.279618 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c\": container with ID starting with e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c not found: ID does not exist" containerID="e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.279671 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c"} err="failed to get container status \"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c\": rpc error: code = NotFound desc = could not find container \"e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c\": container with ID starting with e861fc5c4abe6bcd6c4497d185a6379624960a3654971c79f5c910de03d7eb6c not found: ID does not exist" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.279700 4752 scope.go:117] "RemoveContainer" containerID="389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8" Nov 24 13:40:58 crc kubenswrapper[4752]: E1124 13:40:58.279964 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8\": container with ID starting with 389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8 not found: ID does not exist" containerID="389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.279991 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8"} err="failed to get container status \"389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8\": rpc error: code = NotFound desc = could not find container \"389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8\": container with ID starting with 389c5a75e0b6521f91851cce42727c4f35f92e38fe1a9588ef48c0cc864ba1f8 not found: ID does not exist" Nov 24 13:40:58 crc kubenswrapper[4752]: I1124 13:40:58.749963 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" path="/var/lib/kubelet/pods/804a9776-b4f3-4eda-bf42-aafc1b562519/volumes" Nov 24 13:41:02 crc kubenswrapper[4752]: I1124 13:41:02.729205 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:41:02 crc kubenswrapper[4752]: E1124 13:41:02.730475 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:41:16 crc kubenswrapper[4752]: I1124 13:41:16.728730 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:41:16 crc kubenswrapper[4752]: E1124 13:41:16.729653 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:41:31 crc kubenswrapper[4752]: I1124 13:41:31.728351 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:41:31 crc kubenswrapper[4752]: E1124 13:41:31.730155 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:41:44 crc kubenswrapper[4752]: I1124 13:41:44.740422 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:41:44 crc kubenswrapper[4752]: E1124 13:41:44.741650 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:41:59 crc kubenswrapper[4752]: I1124 13:41:59.730354 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:41:59 crc kubenswrapper[4752]: E1124 13:41:59.732125 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:42:10 crc kubenswrapper[4752]: I1124 13:42:10.727962 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:42:10 crc kubenswrapper[4752]: E1124 13:42:10.728797 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:42:23 crc kubenswrapper[4752]: I1124 13:42:23.728728 4752 
scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:42:23 crc kubenswrapper[4752]: E1124 13:42:23.729825 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:42:35 crc kubenswrapper[4752]: I1124 13:42:35.728538 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:42:35 crc kubenswrapper[4752]: E1124 13:42:35.729368 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:42:49 crc kubenswrapper[4752]: I1124 13:42:49.728871 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:42:49 crc kubenswrapper[4752]: E1124 13:42:49.729697 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.454843 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdc4c"] Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456658 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server" Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456683 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server" Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456708 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-utilities" Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456721 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-utilities" Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456737 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-content" Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456768 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-content" Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.457210 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server" Nov 24 13:42:59 crc 
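The block above is the kubelet's crash-loop back-off at its 5-minute cap: every sync the kubelet considers restarting machine-config-daemon, sees the back-off window is still open, and skips the pod. A sketch of the capped exponential back-off these messages imply; the 10s base and doubling match the kubelet's documented CrashLoopBackOff defaults, but treat the exact numbers as illustrative:

	package main

	import (
		"fmt"
		"time"
	)

	// crashLoopDelay returns the wait before restart attempt n: it doubles
	// from a 10s base and caps at 5m, the "back-off 5m0s" seen above.
	func crashLoopDelay(n int) time.Duration {
		const base, maxDelay = 10 * time.Second, 5 * time.Minute
		d := base
		for i := 0; i < n; i++ {
			d *= 2
			if d > maxDelay {
				return maxDelay
			}
		}
		return d
	}

	func main() {
		for n := 0; n <= 6; n++ {
			fmt.Printf("attempt %d: wait %v\n", n, crashLoopDelay(n))
		}
		// attempt 0: 10s, 1: 20s, 2: 40s, ... capped at 5m0s from attempt 5 on
	}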
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.454843 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdc4c"]
Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456658 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456683 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server"
Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456708 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-utilities"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456721 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-utilities"
Nov 24 13:42:59 crc kubenswrapper[4752]: E1124 13:42:59.456737 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-content"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.456768 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="extract-content"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.457210 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="804a9776-b4f3-4eda-bf42-aafc1b562519" containerName="registry-server"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.460609 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.483701 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdc4c"]
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.560700 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr54b\" (UniqueName: \"kubernetes.io/projected/6d92655d-8f98-4e2b-8200-335ca29de3fb-kube-api-access-mr54b\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.560946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-catalog-content\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.561165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-utilities\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.662371 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr54b\" (UniqueName: \"kubernetes.io/projected/6d92655d-8f98-4e2b-8200-335ca29de3fb-kube-api-access-mr54b\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.662492 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-catalog-content\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.662588 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-utilities\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.663343 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-utilities\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.663351 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d92655d-8f98-4e2b-8200-335ca29de3fb-catalog-content\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
Nov 24 13:42:59 crc kubenswrapper[4752]: I1124 13:42:59.681783 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr54b\" (UniqueName: \"kubernetes.io/projected/6d92655d-8f98-4e2b-8200-335ca29de3fb-kube-api-access-mr54b\") pod \"community-operators-jdc4c\" (UID: \"6d92655d-8f98-4e2b-8200-335ca29de3fb\") " pod="openshift-marketplace/community-operators-jdc4c"
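Each volume for community-operators-jdc4c walks the same three steps above: VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded. That is the kubelet's volume manager reconciling a desired state (the volumes the pod spec needs) against an actual state (what is currently mounted). A toy sketch of that reconcile shape, with made-up types rather than the kubelet's real caches:

	package main

	import "fmt"

	// reconcile brings actual in line with desired, echoing the
	// MountVolume/UnmountVolume phrasing from the entries above.
	func reconcile(desired, actual map[string]bool) {
		for vol := range desired {
			if !actual[vol] {
				fmt.Printf("MountVolume started for volume %q\n", vol)
				actual[vol] = true
				fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
			}
		}
		for vol := range actual {
			if !desired[vol] {
				fmt.Printf("UnmountVolume started for volume %q\n", vol)
				delete(actual, vol)
			}
		}
	}

	func main() {
		desired := map[string]bool{
			"utilities":             true,
			"catalog-content":       true,
			"kube-api-access-mr54b": true,
		}
		reconcile(desired, map[string]bool{}) // mounts all three, like the jdc4c entries above
	}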
firstStartedPulling="2025-11-24 13:43:01.593844179 +0000 UTC m=+9387.578664468" lastFinishedPulling="2025-11-24 13:43:07.15238411 +0000 UTC m=+9393.137204409" observedRunningTime="2025-11-24 13:43:07.686655736 +0000 UTC m=+9393.671476025" watchObservedRunningTime="2025-11-24 13:43:07.689138098 +0000 UTC m=+9393.673958387" Nov 24 13:43:09 crc kubenswrapper[4752]: I1124 13:43:09.802918 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jdc4c" Nov 24 13:43:09 crc kubenswrapper[4752]: I1124 13:43:09.804437 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jdc4c" Nov 24 13:43:09 crc kubenswrapper[4752]: I1124 13:43:09.866149 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jdc4c" Nov 24 13:43:16 crc kubenswrapper[4752]: I1124 13:43:16.727861 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:43:16 crc kubenswrapper[4752]: E1124 13:43:16.729645 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:43:19 crc kubenswrapper[4752]: I1124 13:43:19.864135 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jdc4c" Nov 24 13:43:19 crc kubenswrapper[4752]: I1124 13:43:19.954119 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdc4c"] Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.025135 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.025383 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llgnn" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="registry-server" containerID="cri-o://e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c" gracePeriod=2 Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.607959 4752 util.go:48] "No ready sandbox for pod can be found. 
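The startup-latency entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (13:43:07.689 - 13:42:59 = 8.689s), and podStartSLOduration subtracts the image-pull window (8.689s minus the 5.559s between firstStartedPulling and lastFinishedPulling ≈ 3.131s, matching the logged value up to rounding). The timestamps use Go's default time formatting, so the check is a few lines:

	package main

	import (
		"fmt"
		"time"
	)

	func main() {
		// Layout matching Go's default Time.String() output used in the log.
		const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
		created, _ := time.Parse(layout, "2025-11-24 13:42:59 +0000 UTC")
		pullStart, _ := time.Parse(layout, "2025-11-24 13:43:01.593844179 +0000 UTC")
		pullEnd, _ := time.Parse(layout, "2025-11-24 13:43:07.15238411 +0000 UTC")
		running, _ := time.Parse(layout, "2025-11-24 13:43:07.689138098 +0000 UTC")

		e2e := running.Sub(created)               // 8.689138098s, the podStartE2EDuration
		slo := e2e - pullEnd.Sub(pullStart)       // ≈ 3.1306s, the podStartSLOduration
		fmt.Println(e2e, slo)
	}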
Need to start a new one" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.701243 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content\") pod \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.701289 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities\") pod \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.701368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8mcv\" (UniqueName: \"kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv\") pod \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\" (UID: \"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7\") " Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.702284 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities" (OuterVolumeSpecName: "utilities") pod "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" (UID: "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.728963 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv" (OuterVolumeSpecName: "kube-api-access-k8mcv") pod "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" (UID: "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7"). InnerVolumeSpecName "kube-api-access-k8mcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.768859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" (UID: "5d0e16b8-9861-42f5-9ea9-3bc37334e0d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.803965 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.804007 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.804018 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8mcv\" (UniqueName: \"kubernetes.io/projected/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7-kube-api-access-k8mcv\") on node \"crc\" DevicePath \"\"" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.815194 4752 generic.go:334] "Generic (PLEG): container finished" podID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerID="e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c" exitCode=0 Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.816384 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llgnn" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.818828 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerDied","Data":"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c"} Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.818880 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llgnn" event={"ID":"5d0e16b8-9861-42f5-9ea9-3bc37334e0d7","Type":"ContainerDied","Data":"8fc2f0bf59f7095e9c595929892bd863a8bd873ccc1ce73909c5f368ea3817a6"} Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.818898 4752 scope.go:117] "RemoveContainer" containerID="e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.866810 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.873927 4752 scope.go:117] "RemoveContainer" containerID="9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2" Nov 24 13:43:20 crc kubenswrapper[4752]: I1124 13:43:20.897525 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-llgnn"] Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.046936 4752 scope.go:117] "RemoveContainer" containerID="a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.109480 4752 scope.go:117] "RemoveContainer" containerID="e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c" Nov 24 13:43:21 crc kubenswrapper[4752]: E1124 13:43:21.112900 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c\": container with ID starting with e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c not found: ID does not exist" containerID="e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.112950 
4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c"} err="failed to get container status \"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c\": rpc error: code = NotFound desc = could not find container \"e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c\": container with ID starting with e028edf06807a2676d3d40b0f38dc42eaca38d94f52f41261d141acb9583b73c not found: ID does not exist" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.112977 4752 scope.go:117] "RemoveContainer" containerID="9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2" Nov 24 13:43:21 crc kubenswrapper[4752]: E1124 13:43:21.113361 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2\": container with ID starting with 9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2 not found: ID does not exist" containerID="9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.113405 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2"} err="failed to get container status \"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2\": rpc error: code = NotFound desc = could not find container \"9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2\": container with ID starting with 9364f76c3c4b854fb8248c261b0f2b6ae7b204b410aaa4ebf13ede5761b40bf2 not found: ID does not exist" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.113439 4752 scope.go:117] "RemoveContainer" containerID="a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21" Nov 24 13:43:21 crc kubenswrapper[4752]: E1124 13:43:21.113713 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21\": container with ID starting with a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21 not found: ID does not exist" containerID="a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21" Nov 24 13:43:21 crc kubenswrapper[4752]: I1124 13:43:21.113739 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21"} err="failed to get container status \"a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21\": rpc error: code = NotFound desc = could not find container \"a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21\": container with ID starting with a1f16d044088a62d67f6d574468f4427576b3a18a6528a836564dca841ef8f21 not found: ID does not exist" Nov 24 13:43:22 crc kubenswrapper[4752]: I1124 13:43:22.747880 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" path="/var/lib/kubelet/pods/5d0e16b8-9861-42f5-9ea9-3bc37334e0d7/volumes" Nov 24 13:43:28 crc kubenswrapper[4752]: I1124 13:43:28.728987 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:43:28 crc kubenswrapper[4752]: E1124 13:43:28.730191 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:43:39 crc kubenswrapper[4752]: I1124 13:43:39.729639 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:43:39 crc kubenswrapper[4752]: E1124 13:43:39.731174 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:43:51 crc kubenswrapper[4752]: I1124 13:43:51.729378 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:43:52 crc kubenswrapper[4752]: I1124 13:43:52.163163 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211"} Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.146515 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg"] Nov 24 13:45:00 crc kubenswrapper[4752]: E1124 13:45:00.147764 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="extract-content" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.147782 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="extract-content" Nov 24 13:45:00 crc kubenswrapper[4752]: E1124 13:45:00.147811 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="extract-utilities" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.147821 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="extract-utilities" Nov 24 13:45:00 crc kubenswrapper[4752]: E1124 13:45:00.147873 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="registry-server" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.147881 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="registry-server" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.148115 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0e16b8-9861-42f5-9ea9-3bc37334e0d7" containerName="registry-server" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.149049 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.151629 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.152251 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.157149 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg"] Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.281392 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.281669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjcp\" (UniqueName: \"kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.281814 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.384516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.384557 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjcp\" (UniqueName: \"kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.384597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.387376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume\") pod 
\"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.395354 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.401276 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnjcp\" (UniqueName: \"kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp\") pod \"collect-profiles-29399865-2prvg\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.472035 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:00 crc kubenswrapper[4752]: I1124 13:45:00.933095 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg"] Nov 24 13:45:00 crc kubenswrapper[4752]: W1124 13:45:00.950609 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965d7607_4d08_4d28_b36c_73caed6dedc1.slice/crio-4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c WatchSource:0}: Error finding container 4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c: Status 404 returned error can't find the container with id 4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c Nov 24 13:45:01 crc kubenswrapper[4752]: I1124 13:45:01.962380 4752 generic.go:334] "Generic (PLEG): container finished" podID="965d7607-4d08-4d28-b36c-73caed6dedc1" containerID="139be79a0e9de8493169aa8c4bcebaeb150e1d02fb27b554270d6780c262ab44" exitCode=0 Nov 24 13:45:01 crc kubenswrapper[4752]: I1124 13:45:01.962453 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" event={"ID":"965d7607-4d08-4d28-b36c-73caed6dedc1","Type":"ContainerDied","Data":"139be79a0e9de8493169aa8c4bcebaeb150e1d02fb27b554270d6780c262ab44"} Nov 24 13:45:01 crc kubenswrapper[4752]: I1124 13:45:01.962702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" event={"ID":"965d7607-4d08-4d28-b36c-73caed6dedc1","Type":"ContainerStarted","Data":"4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c"} Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.439215 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.457659 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume\") pod \"965d7607-4d08-4d28-b36c-73caed6dedc1\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.457720 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnjcp\" (UniqueName: \"kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp\") pod \"965d7607-4d08-4d28-b36c-73caed6dedc1\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.458107 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume\") pod \"965d7607-4d08-4d28-b36c-73caed6dedc1\" (UID: \"965d7607-4d08-4d28-b36c-73caed6dedc1\") " Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.464320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume" (OuterVolumeSpecName: "config-volume") pod "965d7607-4d08-4d28-b36c-73caed6dedc1" (UID: "965d7607-4d08-4d28-b36c-73caed6dedc1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.464666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "965d7607-4d08-4d28-b36c-73caed6dedc1" (UID: "965d7607-4d08-4d28-b36c-73caed6dedc1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.464896 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp" (OuterVolumeSpecName: "kube-api-access-tnjcp") pod "965d7607-4d08-4d28-b36c-73caed6dedc1" (UID: "965d7607-4d08-4d28-b36c-73caed6dedc1"). InnerVolumeSpecName "kube-api-access-tnjcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.562322 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/965d7607-4d08-4d28-b36c-73caed6dedc1-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.562364 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965d7607-4d08-4d28-b36c-73caed6dedc1-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.562379 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnjcp\" (UniqueName: \"kubernetes.io/projected/965d7607-4d08-4d28-b36c-73caed6dedc1-kube-api-access-tnjcp\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.985477 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" event={"ID":"965d7607-4d08-4d28-b36c-73caed6dedc1","Type":"ContainerDied","Data":"4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c"} Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.985823 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c5001247dee3fc2617fa55bb15e4635a30f3dc0f3d8791ab8dda43699717e1c" Nov 24 13:45:03 crc kubenswrapper[4752]: I1124 13:45:03.985520 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399865-2prvg" Nov 24 13:45:04 crc kubenswrapper[4752]: I1124 13:45:04.523770 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb"] Nov 24 13:45:04 crc kubenswrapper[4752]: I1124 13:45:04.534393 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399820-sltfb"] Nov 24 13:45:04 crc kubenswrapper[4752]: I1124 13:45:04.748405 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c51d28f-4a83-4b39-bcff-877881ab970c" path="/var/lib/kubelet/pods/7c51d28f-4a83-4b39-bcff-877881ab970c/volumes" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.379089 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:33 crc kubenswrapper[4752]: E1124 13:45:33.381202 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965d7607-4d08-4d28-b36c-73caed6dedc1" containerName="collect-profiles" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.381304 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="965d7607-4d08-4d28-b36c-73caed6dedc1" containerName="collect-profiles" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.381777 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="965d7607-4d08-4d28-b36c-73caed6dedc1" containerName="collect-profiles" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.389648 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.415615 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.564408 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.564891 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qq96\" (UniqueName: \"kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.565074 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.667548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.667684 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qq96\" (UniqueName: \"kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.667874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.668093 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.668435 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.694646 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5qq96\" (UniqueName: \"kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96\") pod \"certified-operators-hgh9v\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:33 crc kubenswrapper[4752]: I1124 13:45:33.712112 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:34 crc kubenswrapper[4752]: I1124 13:45:34.273270 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:35 crc kubenswrapper[4752]: I1124 13:45:35.347378 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerID="bf849acacc14fc92662e841240f947afc4514e0c473cdff349aef43fe030c09c" exitCode=0 Nov 24 13:45:35 crc kubenswrapper[4752]: I1124 13:45:35.347478 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerDied","Data":"bf849acacc14fc92662e841240f947afc4514e0c473cdff349aef43fe030c09c"} Nov 24 13:45:35 crc kubenswrapper[4752]: I1124 13:45:35.347949 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerStarted","Data":"a57500b674a18964f2c3ca92272b73ed510bfbdf819d30be11082cd2382367be"} Nov 24 13:45:35 crc kubenswrapper[4752]: I1124 13:45:35.349284 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:45:36 crc kubenswrapper[4752]: I1124 13:45:36.372997 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerStarted","Data":"26616219592c9a1654c84049e9bee0f90cfdca8126fc2a40d55245b22ede689f"} Nov 24 13:45:38 crc kubenswrapper[4752]: I1124 13:45:38.397002 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerID="26616219592c9a1654c84049e9bee0f90cfdca8126fc2a40d55245b22ede689f" exitCode=0 Nov 24 13:45:38 crc kubenswrapper[4752]: I1124 13:45:38.397077 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerDied","Data":"26616219592c9a1654c84049e9bee0f90cfdca8126fc2a40d55245b22ede689f"} Nov 24 13:45:39 crc kubenswrapper[4752]: I1124 13:45:39.415267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerStarted","Data":"7da1c2b865bcae4470b2c53b84cd75c0aff709c91b7f817972b253c0a482662c"} Nov 24 13:45:39 crc kubenswrapper[4752]: I1124 13:45:39.441401 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hgh9v" podStartSLOduration=2.989622865 podStartE2EDuration="6.4413787s" podCreationTimestamp="2025-11-24 13:45:33 +0000 UTC" firstStartedPulling="2025-11-24 13:45:35.349055221 +0000 UTC m=+9541.333875500" lastFinishedPulling="2025-11-24 13:45:38.800811046 +0000 UTC m=+9544.785631335" observedRunningTime="2025-11-24 13:45:39.436043567 +0000 UTC m=+9545.420863856" watchObservedRunningTime="2025-11-24 
13:45:39.4413787 +0000 UTC m=+9545.426198989" Nov 24 13:45:43 crc kubenswrapper[4752]: I1124 13:45:43.713192 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:43 crc kubenswrapper[4752]: I1124 13:45:43.713729 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:43 crc kubenswrapper[4752]: I1124 13:45:43.765012 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:44 crc kubenswrapper[4752]: I1124 13:45:44.547078 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:44 crc kubenswrapper[4752]: I1124 13:45:44.595826 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:46 crc kubenswrapper[4752]: I1124 13:45:46.495011 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hgh9v" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="registry-server" containerID="cri-o://7da1c2b865bcae4470b2c53b84cd75c0aff709c91b7f817972b253c0a482662c" gracePeriod=2 Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.510919 4752 generic.go:334] "Generic (PLEG): container finished" podID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerID="7da1c2b865bcae4470b2c53b84cd75c0aff709c91b7f817972b253c0a482662c" exitCode=0 Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.511005 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerDied","Data":"7da1c2b865bcae4470b2c53b84cd75c0aff709c91b7f817972b253c0a482662c"} Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.511060 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgh9v" event={"ID":"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c","Type":"ContainerDied","Data":"a57500b674a18964f2c3ca92272b73ed510bfbdf819d30be11082cd2382367be"} Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.511077 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57500b674a18964f2c3ca92272b73ed510bfbdf819d30be11082cd2382367be" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.515836 4752 util.go:48] "No ready sandbox for pod can be found. 
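gracePeriod=2 above is the short termination grace used for these catalog pods: the runtime signals the container with SIGTERM, waits up to two seconds, then escalates to SIGKILL. A stand-alone sketch of that escalation, assuming the target is a child process of the caller (CRI-O's real path goes through the runtime, not os.Process):

	package main

	import (
		"fmt"
		"os"
		"os/exec"
		"syscall"
		"time"
	)

	// killWithGrace sends SIGTERM, waits up to grace, then SIGKILLs,
	// mirroring "Killing container with a grace period ... gracePeriod=2".
	func killWithGrace(proc *os.Process, grace time.Duration) error {
		if err := proc.Signal(syscall.SIGTERM); err != nil {
			return err
		}
		done := make(chan error, 1)
		go func() {
			_, err := proc.Wait() // only valid for our own child processes
			done <- err
		}()
		select {
		case err := <-done:
			return err // exited within the grace period
		case <-time.After(grace):
			fmt.Println("grace period expired, escalating to SIGKILL")
			return proc.Kill()
		}
	}

	func main() {
		cmd := exec.Command("sleep", "60")
		if err := cmd.Start(); err != nil {
			panic(err)
		}
		fmt.Println(killWithGrace(cmd.Process, 2*time.Second))
	}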
Need to start a new one" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.586242 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities\") pod \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.586763 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content\") pod \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.586866 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qq96\" (UniqueName: \"kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96\") pod \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\" (UID: \"a8ac8df1-02dc-43f5-93d8-f2c20d05a28c\") " Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.587596 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities" (OuterVolumeSpecName: "utilities") pod "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" (UID: "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.600485 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96" (OuterVolumeSpecName: "kube-api-access-5qq96") pod "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" (UID: "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c"). InnerVolumeSpecName "kube-api-access-5qq96". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.636479 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" (UID: "a8ac8df1-02dc-43f5-93d8-f2c20d05a28c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.689106 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.689144 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:47 crc kubenswrapper[4752]: I1124 13:45:47.689155 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qq96\" (UniqueName: \"kubernetes.io/projected/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c-kube-api-access-5qq96\") on node \"crc\" DevicePath \"\"" Nov 24 13:45:48 crc kubenswrapper[4752]: I1124 13:45:48.523181 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hgh9v" Nov 24 13:45:48 crc kubenswrapper[4752]: I1124 13:45:48.588056 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:48 crc kubenswrapper[4752]: I1124 13:45:48.610659 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hgh9v"] Nov 24 13:45:48 crc kubenswrapper[4752]: I1124 13:45:48.739631 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" path="/var/lib/kubelet/pods/a8ac8df1-02dc-43f5-93d8-f2c20d05a28c/volumes" Nov 24 13:45:50 crc kubenswrapper[4752]: I1124 13:45:50.720417 4752 scope.go:117] "RemoveContainer" containerID="8d738190f518ed98518543d01faa817a6a104c05fe3aefc8374d73aa1f3c5e3c" Nov 24 13:46:15 crc kubenswrapper[4752]: I1124 13:46:15.469040 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:46:15 crc kubenswrapper[4752]: I1124 13:46:15.469661 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:46:45 crc kubenswrapper[4752]: I1124 13:46:45.468872 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:46:45 crc kubenswrapper[4752]: I1124 13:46:45.469378 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.469231 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.469913 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.469961 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.470782 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.470871 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211" gracePeriod=600 Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.531268 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_34761b79-c432-4e7e-9715-131bb3bb4450/init-config-reloader/0.log" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.662450 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_34761b79-c432-4e7e-9715-131bb3bb4450/init-config-reloader/0.log" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.748984 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_34761b79-c432-4e7e-9715-131bb3bb4450/config-reloader/0.log" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.779881 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_34761b79-c432-4e7e-9715-131bb3bb4450/alertmanager/0.log" Nov 24 13:47:15 crc kubenswrapper[4752]: I1124 13:47:15.969907 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b8cfa846-da98-401c-968b-aa10bf8093de/aodh-listener/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.008446 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b8cfa846-da98-401c-968b-aa10bf8093de/aodh-evaluator/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.033257 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b8cfa846-da98-401c-968b-aa10bf8093de/aodh-api/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.208583 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b8cfa846-da98-401c-968b-aa10bf8093de/aodh-notifier/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.289516 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-547c99f9b6-9fzgx_d182b26e-e887-47d5-a834-4584f6110213/barbican-api/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.296807 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-547c99f9b6-9fzgx_d182b26e-e887-47d5-a834-4584f6110213/barbican-api-log/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.529248 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211" exitCode=0 Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.529301 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211"} Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.529341 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerStarted","Data":"5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"} Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.529371 4752 scope.go:117] "RemoveContainer" containerID="ef6259f52270e7ab7fb09374faf507034084284bf08d7836a2a3735a7e8bbb29" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.792724 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdb9cc7d-h6rsf_ab3ff015-f006-4c8b-8cde-6191a3ddf473/barbican-keystone-listener-log/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.814755 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6bbdb9cc7d-h6rsf_ab3ff015-f006-4c8b-8cde-6191a3ddf473/barbican-keystone-listener/0.log" Nov 24 13:47:16 crc kubenswrapper[4752]: I1124 13:47:16.998386 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65559fc695-tzc6z_a4cbe655-b57b-4d57-97e5-7e1f18c47167/barbican-worker/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.021114 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65559fc695-tzc6z_a4cbe655-b57b-4d57-97e5-7e1f18c47167/barbican-worker-log/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.138829 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-956sb_d98851aa-db89-424e-9304-44681012e2f0/bootstrap-openstack-openstack-cell1/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.282290 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87748d6d-7490-4a95-9cdc-5fc516929b3d/ceilometer-central-agent/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.341498 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87748d6d-7490-4a95-9cdc-5fc516929b3d/ceilometer-notification-agent/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.402301 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87748d6d-7490-4a95-9cdc-5fc516929b3d/proxy-httpd/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.561421 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-s67n5_7acb322e-d9f4-48b7-a023-d42f3614ecc7/ceph-client-openstack-openstack-cell1/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.581145 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87748d6d-7490-4a95-9cdc-5fc516929b3d/sg-core/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.771705 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1df89d62-db8a-458a-a85b-cf1e95d942e8/cinder-api-log/0.log" Nov 24 13:47:17 crc kubenswrapper[4752]: I1124 13:47:17.849241 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1df89d62-db8a-458a-a85b-cf1e95d942e8/cinder-api/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.060597 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e731d115-86e9-4f89-b91d-955e67f8309c/probe/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.119599 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_e731d115-86e9-4f89-b91d-955e67f8309c/cinder-backup/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.162145 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_019f042d-12be-4cdc-b195-470abf83bb3a/cinder-scheduler/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.300189 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_019f042d-12be-4cdc-b195-470abf83bb3a/probe/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.390169 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_71d02fa2-b391-4f1a-9181-25e3469dd49b/cinder-volume/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.477156 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_71d02fa2-b391-4f1a-9181-25e3469dd49b/probe/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.572104 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-ftbqj_9b8b46cb-745f-489f-84e7-2b4e9001ac6e/configure-network-openstack-openstack-cell1/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.634406 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-rqzkn_ee7cb878-862d-49ed-ace6-1ba9b4b6daf0/configure-os-openstack-openstack-cell1/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.726437 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-kbnrm_8729d4c7-0edc-4f70-a988-ffb7c2a265ca/init/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.913550 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-kbnrm_8729d4c7-0edc-4f70-a988-ffb7c2a265ca/init/0.log" Nov 24 13:47:18 crc kubenswrapper[4752]: I1124 13:47:18.959478 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5cdb84b55c-kbnrm_8729d4c7-0edc-4f70-a988-ffb7c2a265ca/dnsmasq-dns/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.007535 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-t6pjb_291a26fd-6d14-45f7-bd62-cf7806827e6d/download-cache-openstack-openstack-cell1/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.135248 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b1d02810-1b43-40b7-9b3c-99f316e7d3a9/glance-log/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.168107 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b1d02810-1b43-40b7-9b3c-99f316e7d3a9/glance-httpd/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.287932 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_571d263a-a54a-4576-a0d2-cb6325f91b19/glance-httpd/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.307083 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_571d263a-a54a-4576-a0d2-cb6325f91b19/glance-log/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.531174 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-656f79b868-6plw4_62674a23-d680-4d6f-9764-aed369cee2a0/heat-api/0.log" Nov 24 13:47:19 crc 
kubenswrapper[4752]: I1124 13:47:19.616657 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-78b7786c85-fxgnh_7a2456d8-9cb0-4de8-b42a-bc5b7f7bd5a9/heat-cfnapi/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.667334 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-85c675c6dc-z7nps_d41cbaf4-ce55-43f4-8940-f58a0b2c62c0/heat-engine/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.885303 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-7gskg_b27b5307-a3a9-405c-beb9-5a8774f330d6/install-certs-openstack-openstack-cell1/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.910993 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554cf5f95c-cppx5_da80ccb9-4d2f-49d3-8689-5cf720968e94/horizon-log/0.log" Nov 24 13:47:19 crc kubenswrapper[4752]: I1124 13:47:19.911725 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554cf5f95c-cppx5_da80ccb9-4d2f-49d3-8689-5cf720968e94/horizon/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.113384 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-wq8r6_7f26d91c-fd10-48f3-b4cd-1cecf074d2b0/install-os-openstack-openstack-cell1/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.251962 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d8447c8fd-t89p7_1fd5ad40-5a39-48db-8a9d-9d200ef0d0a8/keystone-api/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.476611 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29399821-qd67c_6cf09bbe-f5a2-4703-8c32-b1e84e6e26d2/keystone-cron/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.536230 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f528da89-b1d3-4b08-a8af-b8371b35ff7c/kube-state-metrics/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.698584 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-ll4l5_e5c17484-a3bc-4bac-a15d-5a8365781b23/libvirt-openstack-openstack-cell1/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.796336 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a736d712-c29c-4228-b878-46ab90132fe4/manila-api-log/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.827646 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a736d712-c29c-4228-b878-46ab90132fe4/manila-api/0.log" Nov 24 13:47:20 crc kubenswrapper[4752]: I1124 13:47:20.997151 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_d9e8e957-b8c5-40bc-bbb6-aa5800cac83f/probe/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.035143 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_d9e8e957-b8c5-40bc-bbb6-aa5800cac83f/manila-scheduler/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.127174 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5bf54a05-b002-4171-8c39-0707e219e168/probe/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.137616 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-share-share1-0_5bf54a05-b002-4171-8c39-0707e219e168/manila-share/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.242121 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_a69e61b5-1af7-4e15-90d6-a255a4e8e897/adoption/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.560673 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8658d4b465-f6xtp_7919c790-c895-4c2c-bae5-4dd6fb5a86bf/neutron-httpd/0.log" Nov 24 13:47:21 crc kubenswrapper[4752]: I1124 13:47:21.625630 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8658d4b465-f6xtp_7919c790-c895-4c2c-bae5-4dd6fb5a86bf/neutron-api/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.213777 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-j8d65_4fc5d8d6-9c58-41e8-a3e3-6c41ae4bc7f7/neutron-dhcp-openstack-openstack-cell1/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.321112 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-jqv6t_6a6b2962-03f6-4ed4-b504-429861f14548/neutron-metadata-openstack-openstack-cell1/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.523953 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-5d49m_82f20cf4-0c2e-483d-8105-643e3c975dd2/neutron-sriov-openstack-openstack-cell1/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.646313 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_eb38b860-ee90-4ebe-bf0d-02285792bbc9/nova-api-api/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.692231 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_eb38b860-ee90-4ebe-bf0d-02285792bbc9/nova-api-log/0.log" Nov 24 13:47:22 crc kubenswrapper[4752]: I1124 13:47:22.873787 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a40b5c05-d6a1-457c-bba1-8496e195c2a4/nova-cell0-conductor-conductor/0.log" Nov 24 13:47:23 crc kubenswrapper[4752]: I1124 13:47:23.042495 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d2c088e9-08c6-4695-8c18-c345a40d1eb1/nova-cell1-conductor-conductor/0.log" Nov 24 13:47:23 crc kubenswrapper[4752]: I1124 13:47:23.279582 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a3e6bac3-32fc-4dfe-9925-e297bc7c1059/nova-cell1-novncproxy-novncproxy/0.log" Nov 24 13:47:23 crc kubenswrapper[4752]: I1124 13:47:23.388088 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellq4cgc_6ef01ca0-2da9-4115-8152-29ea3b6d7d3b/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.252427 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-rbv2q_d9673c7e-0da7-40fd-880a-f53c18050035/nova-cell1-openstack-openstack-cell1/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.270016 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74f88a59-9db6-49da-9d6f-b564689e662f/nova-metadata-log/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.281356 4752 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74f88a59-9db6-49da-9d6f-b564689e662f/nova-metadata-metadata/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.782459 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-85bf5b5c78-688s2_497355f0-5071-444a-96df-145e4220d015/init/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.902157 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e9d8eb2b-bb50-40ba-89d0-38898e19bf14/nova-scheduler-scheduler/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.977043 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-85bf5b5c78-688s2_497355f0-5071-444a-96df-145e4220d015/octavia-api-provider-agent/0.log" Nov 24 13:47:24 crc kubenswrapper[4752]: I1124 13:47:24.993151 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-85bf5b5c78-688s2_497355f0-5071-444a-96df-145e4220d015/init/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.153877 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-d6fr7_97c0db40-afff-4067-9fee-06657cfbf155/init/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.210525 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-85bf5b5c78-688s2_497355f0-5071-444a-96df-145e4220d015/octavia-api/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.535870 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-zvcnm_4fb19eb4-b714-4b63-a9b3-9e2427994194/init/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.594610 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-d6fr7_97c0db40-afff-4067-9fee-06657cfbf155/init/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.648162 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-d6fr7_97c0db40-afff-4067-9fee-06657cfbf155/octavia-healthmanager/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.725988 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-zvcnm_4fb19eb4-b714-4b63-a9b3-9e2427994194/init/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.803044 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-zvcnm_4fb19eb4-b714-4b63-a9b3-9e2427994194/octavia-housekeeping/0.log" Nov 24 13:47:25 crc kubenswrapper[4752]: I1124 13:47:25.865626 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-795nt_e2a0fc67-fcee-4ec4-ae55-de5a44214b27/init/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.121023 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-795nt_e2a0fc67-fcee-4ec4-ae55-de5a44214b27/octavia-rsyslog/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.200205 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-fztv9_ec1f33db-dd01-4521-b4de-c2d6cecc5695/init/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.211303 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-795nt_e2a0fc67-fcee-4ec4-ae55-de5a44214b27/init/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.459505 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-fztv9_ec1f33db-dd01-4521-b4de-c2d6cecc5695/init/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.611855 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33/mysql-bootstrap/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.763167 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-fztv9_ec1f33db-dd01-4521-b4de-c2d6cecc5695/octavia-worker/0.log" Nov 24 13:47:26 crc kubenswrapper[4752]: I1124 13:47:26.818557 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33/galera/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.002440 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d606151b-c042-4691-89a7-f1d8f21d033d/mysql-bootstrap/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.037289 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1fefe6b9-4e33-4ade-9212-0d2f8f6d4a33/mysql-bootstrap/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.311881 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d606151b-c042-4691-89a7-f1d8f21d033d/mysql-bootstrap/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.324295 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_13a1dca4-d743-45d2-b4a8-262db404b0b4/openstackclient/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.362695 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d606151b-c042-4691-89a7-f1d8f21d033d/galera/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.529907 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-f95n8_0be41fba-f8d1-426b-bedc-9d318f73bbbd/ovn-controller/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.652547 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-t82l7_02042310-6413-447c-9f15-0973957090ad/openstack-network-exporter/0.log" Nov 24 13:47:27 crc kubenswrapper[4752]: I1124 13:47:27.836954 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z24v5_9e66e131-ee32-478d-86d3-c32da4efcb08/ovsdb-server-init/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.109032 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z24v5_9e66e131-ee32-478d-86d3-c32da4efcb08/ovsdb-server-init/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.118017 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z24v5_9e66e131-ee32-478d-86d3-c32da4efcb08/ovs-vswitchd/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.169201 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z24v5_9e66e131-ee32-478d-86d3-c32da4efcb08/ovsdb-server/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.309120 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_95e8030c-fdfe-47e0-811a-0b32eaa08a00/adoption/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.418517 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_25e82bdb-90e7-49a0-a243-ab4acabf0aa7/openstack-network-exporter/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.547169 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25e82bdb-90e7-49a0-a243-ab4acabf0aa7/ovn-northd/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.781414 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27731060-c814-4ebe-9d1b-4ac937bdc995/openstack-network-exporter/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.795934 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-tvrxz_2b92c65c-7cef-4eca-b3d5-452443fc7fb3/ovn-openstack-openstack-cell1/0.log" Nov 24 13:47:28 crc kubenswrapper[4752]: I1124 13:47:28.915458 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_27731060-c814-4ebe-9d1b-4ac937bdc995/ovsdbserver-nb/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.078361 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_2a45b572-4f27-4ab2-9eb6-488ebe78895e/openstack-network-exporter/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.098491 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_2a45b572-4f27-4ab2-9eb6-488ebe78895e/ovsdbserver-nb/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.415485 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_b25fbb23-a487-4f06-a9c8-66d110ac5903/openstack-network-exporter/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.466885 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_b25fbb23-a487-4f06-a9c8-66d110ac5903/ovsdbserver-nb/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.676931 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82c20dce-5921-46ea-aaa2-b62e24923c1c/openstack-network-exporter/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.700803 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_82c20dce-5921-46ea-aaa2-b62e24923c1c/ovsdbserver-sb/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.858878 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1d1d8b13-b2a7-483a-b1ad-be75868386e7/openstack-network-exporter/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.882628 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1d1d8b13-b2a7-483a-b1ad-be75868386e7/ovsdbserver-sb/0.log" Nov 24 13:47:29 crc kubenswrapper[4752]: I1124 13:47:29.978859 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_63dc0a8c-07d1-4962-8686-22042a0db911/openstack-network-exporter/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.110989 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_63dc0a8c-07d1-4962-8686-22042a0db911/ovsdbserver-sb/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.381738 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-dd66498d8-m7grv_403b2bfd-19de-475e-8460-3d42506d6a5e/placement-api/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.396305 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-dd66498d8-m7grv_403b2bfd-19de-475e-8460-3d42506d6a5e/placement-log/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.471266 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cmqtbp_0e4de15b-2388-46ea-a966-8471f04cd894/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.589210 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_50616cf6-5526-40a2-bd9a-6b9608421058/init-config-reloader/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.815331 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_50616cf6-5526-40a2-bd9a-6b9608421058/prometheus/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.875946 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_50616cf6-5526-40a2-bd9a-6b9608421058/init-config-reloader/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.876318 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_50616cf6-5526-40a2-bd9a-6b9608421058/thanos-sidecar/0.log" Nov 24 13:47:30 crc kubenswrapper[4752]: I1124 13:47:30.902577 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_50616cf6-5526-40a2-bd9a-6b9608421058/config-reloader/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.008664 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3b79e654-374a-40f7-87b6-08a45f62c170/memcached/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.081069 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ca78463f-b2cb-49d4-96a1-44c8b1f2ae85/setup-container/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.332212 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ca78463f-b2cb-49d4-96a1-44c8b1f2ae85/rabbitmq/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.366978 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d47e62f-ca69-4327-97d4-c8e10e7bc522/setup-container/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.392704 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ca78463f-b2cb-49d4-96a1-44c8b1f2ae85/setup-container/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.582377 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d47e62f-ca69-4327-97d4-c8e10e7bc522/setup-container/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.599640 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d47e62f-ca69-4327-97d4-c8e10e7bc522/rabbitmq/0.log" Nov 24 13:47:31 crc kubenswrapper[4752]: I1124 13:47:31.606245 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-xhk6b_2948a607-5d86-4e4e-94ef-cc1cc219ef47/reboot-os-openstack-openstack-cell1/0.log" Nov 24 13:47:32 crc kubenswrapper[4752]: I1124 13:47:32.648806 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-bcvfv_bb41e8f1-6ced-4ade-b2d7-6458f2cb5808/run-os-openstack-openstack-cell1/0.log" Nov 24 13:47:32 crc kubenswrapper[4752]: I1124 13:47:32.700394 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-fm27m_38958449-fe2e-4e55-899d-c6d573fbf809/ssh-known-hosts-openstack/0.log" Nov 24 13:47:32 crc kubenswrapper[4752]: I1124 13:47:32.762720 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-g9gr2_23f896f6-ecd7-426c-98a6-66ce6cec1202/telemetry-openstack-openstack-cell1/0.log" Nov 24 13:47:32 crc kubenswrapper[4752]: I1124 13:47:32.958548 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-hhcgm_6b3b41b1-fa64-45fc-9d41-19b0be538111/validate-network-openstack-openstack-cell1/0.log" Nov 24 13:47:33 crc kubenswrapper[4752]: I1124 13:47:33.003974 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-v5jr7_3e737c81-721d-4220-ac1e-24a3057556fe/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Nov 24 13:47:54 crc kubenswrapper[4752]: I1124 13:47:54.644048 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/util/0.log" Nov 24 13:47:54 crc kubenswrapper[4752]: I1124 13:47:54.779642 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/pull/0.log" Nov 24 13:47:54 crc kubenswrapper[4752]: I1124 13:47:54.801658 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/pull/0.log" Nov 24 13:47:54 crc kubenswrapper[4752]: I1124 13:47:54.816059 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/util/0.log" Nov 24 13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.542113 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/util/0.log" Nov 24 13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.556887 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/extract/0.log" Nov 24 13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.588078 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4a0953fd9391bf39ad887962f638d696e69e0cac7f27b9f8cbf4e65e676qzjp_2061713f-5dc5-441a-bc1a-9702e5959aea/pull/0.log" Nov 24 13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.722214 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-zjh9k_8d27c6dc-dd9d-4061-aa22-a334d0ffce1e/kube-rbac-proxy/0.log" Nov 24 13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.850269 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-chvx8_ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f/kube-rbac-proxy/0.log" Nov 24 
13:47:55 crc kubenswrapper[4752]: I1124 13:47:55.883139 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-zjh9k_8d27c6dc-dd9d-4061-aa22-a334d0ffce1e/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.060498 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-pcppk_e4b457a1-f44d-4117-b4ac-96117293474b/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.062838 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-chvx8_ab3c1b1a-1cd2-4acf-9208-2eb8d370d25f/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.084421 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-pcppk_e4b457a1-f44d-4117-b4ac-96117293474b/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.232188 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-56mql_190fcfd5-d931-49b7-bca0-b0347fb39619/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.374929 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-56mql_190fcfd5-d931-49b7-bca0-b0347fb39619/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.396065 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-r8x8j_c60d4c47-fcde-4a44-ad68-f6113546b3e5/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.447551 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-r8x8j_c60d4c47-fcde-4a44-ad68-f6113546b3e5/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.553467 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-hrlfx_97ad1b28-46b3-4eb3-a00d-03b6e5b58575/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.585731 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-hrlfx_97ad1b28-46b3-4eb3-a00d-03b6e5b58575/manager/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.790073 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dd8864d7c-248dp_e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.860335 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-d8l5p_dec80ec2-1e3d-413e-aed2-426ce66a601a/kube-rbac-proxy/0.log" Nov 24 13:47:56 crc kubenswrapper[4752]: I1124 13:47:56.997890 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dd8864d7c-248dp_e2ee6ec0-9173-4d22-8e3d-00fb401fe5a3/manager/0.log" Nov 24 13:47:57 crc kubenswrapper[4752]: I1124 13:47:57.034020 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-d8l5p_dec80ec2-1e3d-413e-aed2-426ce66a601a/manager/0.log" 
Nov 24 13:47:57 crc kubenswrapper[4752]: I1124 13:47:57.617705 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-wh2h5_a480eff6-4e20-4afc-942a-075e40ef0699/kube-rbac-proxy/0.log" Nov 24 13:47:57 crc kubenswrapper[4752]: I1124 13:47:57.789889 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-wh2h5_a480eff6-4e20-4afc-942a-075e40ef0699/manager/0.log" Nov 24 13:47:57 crc kubenswrapper[4752]: I1124 13:47:57.832727 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-bfqrz_65cdca17-af51-44a1-b1ad-3411ff357a5f/kube-rbac-proxy/0.log" Nov 24 13:47:57 crc kubenswrapper[4752]: I1124 13:47:57.975698 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-bfqrz_65cdca17-af51-44a1-b1ad-3411ff357a5f/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.017798 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-prxnc_d92f3535-049c-451a-8f0f-eb863b9e6319/kube-rbac-proxy/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.125951 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-prxnc_d92f3535-049c-451a-8f0f-eb863b9e6319/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.235500 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-d4fdd_7bb11c48-7c81-4e71-a258-1bd291051c79/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.244428 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-d4fdd_7bb11c48-7c81-4e71-a258-1bd291051c79/kube-rbac-proxy/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.351637 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-kvzg4_cbacf405-082e-46b7-94e2-e881df3184bf/kube-rbac-proxy/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.496776 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-sp84f_0450318e-f006-4998-ad0a-6b21fe253ec8/kube-rbac-proxy/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.576010 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-sp84f_0450318e-f006-4998-ad0a-6b21fe253ec8/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.620681 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-kvzg4_cbacf405-082e-46b7-94e2-e881df3184bf/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.710765 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-csjb7_87c73240-34a4-4ad6-b134-1bacc37d2eaa/kube-rbac-proxy/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.711235 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-csjb7_87c73240-34a4-4ad6-b134-1bacc37d2eaa/manager/0.log" Nov 24 13:47:58 crc kubenswrapper[4752]: I1124 13:47:58.853363 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-585c6fcdf4-p7vw7_c1df47a5-8725-40e7-bc90-2d77c49dba4a/kube-rbac-proxy/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.001427 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-689f78bdf7-xqn7j_2fc2a88f-2768-4cdc-a075-bdaa38c25853/kube-rbac-proxy/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.138237 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-689f78bdf7-xqn7j_2fc2a88f-2768-4cdc-a075-bdaa38c25853/operator/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.190529 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p94tb_3f65a59f-6b3f-45c1-9e63-76472e38cee1/registry-server/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.262193 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-vkkhd_6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4/kube-rbac-proxy/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.401296 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-vdzg7_c49282e6-3072-458d-8fdb-1e8282b3aa59/kube-rbac-proxy/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.515115 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-vkkhd_6acbdf0a-2e7b-4c61-9e29-ca7ec67f0ca4/manager/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.541088 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-vdzg7_c49282e6-3072-458d-8fdb-1e8282b3aa59/manager/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.692665 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4fxhg_0198b944-3886-44c4-85c1-8786136c4f2a/operator/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.821704 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-pqj8q_4a5a406f-6b3d-4919-9bde-e7af06fd38d4/kube-rbac-proxy/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.948330 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-pqj8q_4a5a406f-6b3d-4919-9bde-e7af06fd38d4/manager/0.log" Nov 24 13:47:59 crc kubenswrapper[4752]: I1124 13:47:59.986855 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-p8b8t_341d84f5-7ebf-48bb-a7d1-6c55d45d0c58/kube-rbac-proxy/0.log" Nov 24 13:48:00 crc kubenswrapper[4752]: I1124 13:48:00.228047 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-bbxj8_05751949-a258-4470-b4ad-4ad1ae9a3bc6/kube-rbac-proxy/0.log" Nov 24 13:48:00 crc kubenswrapper[4752]: I1124 13:48:00.249472 4752 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-p8b8t_341d84f5-7ebf-48bb-a7d1-6c55d45d0c58/manager/0.log" Nov 24 13:48:00 crc kubenswrapper[4752]: I1124 13:48:00.264389 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-bbxj8_05751949-a258-4470-b4ad-4ad1ae9a3bc6/manager/0.log" Nov 24 13:48:00 crc kubenswrapper[4752]: I1124 13:48:00.515319 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-llhmp_eb9eb6b3-a33b-4f64-92a4-f2415648f6c6/kube-rbac-proxy/0.log" Nov 24 13:48:00 crc kubenswrapper[4752]: I1124 13:48:00.524260 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-llhmp_eb9eb6b3-a33b-4f64-92a4-f2415648f6c6/manager/0.log" Nov 24 13:48:01 crc kubenswrapper[4752]: I1124 13:48:01.108171 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-585c6fcdf4-p7vw7_c1df47a5-8725-40e7-bc90-2d77c49dba4a/manager/0.log" Nov 24 13:48:18 crc kubenswrapper[4752]: I1124 13:48:18.281447 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-chccp_9b5fea82-8918-4237-99a6-eae4894a0b5f/control-plane-machine-set-operator/0.log" Nov 24 13:48:18 crc kubenswrapper[4752]: I1124 13:48:18.512863 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7xbdq_6e9f72eb-52cd-47cc-b939-3301c0aa7f3c/kube-rbac-proxy/0.log" Nov 24 13:48:18 crc kubenswrapper[4752]: I1124 13:48:18.566231 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7xbdq_6e9f72eb-52cd-47cc-b939-3301c0aa7f3c/machine-api-operator/0.log" Nov 24 13:48:31 crc kubenswrapper[4752]: I1124 13:48:31.921484 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-7k726_d362a2ce-aaf7-464c-b569-1dccdbf6edcf/cert-manager-controller/0.log" Nov 24 13:48:32 crc kubenswrapper[4752]: I1124 13:48:32.027152 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-g4qsq_85b94b6e-97ea-4d23-a0af-42ae0ee795ae/cert-manager-cainjector/0.log" Nov 24 13:48:32 crc kubenswrapper[4752]: I1124 13:48:32.124034 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-rxsjd_c3e7eab4-cc04-458f-adde-096e61b680f2/cert-manager-webhook/0.log" Nov 24 13:48:43 crc kubenswrapper[4752]: I1124 13:48:43.926950 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-tlw8h_645d9b0b-fd05-44fa-84ce-7fda6cd1c786/nmstate-console-plugin/0.log" Nov 24 13:48:44 crc kubenswrapper[4752]: I1124 13:48:44.042788 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lbjrx_776b04ad-d48c-428f-8484-63bd82cde2a2/nmstate-handler/0.log" Nov 24 13:48:44 crc kubenswrapper[4752]: I1124 13:48:44.339736 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-fvwhs_469d4bbf-3549-4f0f-8abe-574354176c0e/kube-rbac-proxy/0.log" Nov 24 13:48:44 crc kubenswrapper[4752]: I1124 13:48:44.341270 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-fvwhs_469d4bbf-3549-4f0f-8abe-574354176c0e/nmstate-metrics/0.log" Nov 24 13:48:44 crc kubenswrapper[4752]: I1124 13:48:44.536523 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-r8bcs_30debda6-1a90-4bb5-8b86-46344bc95a1e/nmstate-webhook/0.log" Nov 24 13:48:44 crc kubenswrapper[4752]: I1124 13:48:44.545466 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-jrd2k_af6619a5-cfae-4ca3-99ff-dc2f716fee60/nmstate-operator/0.log" Nov 24 13:48:59 crc kubenswrapper[4752]: I1124 13:48:59.938001 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-w4xmc_2731a61e-82f6-43be-93c8-d0a5f9000bec/kube-rbac-proxy/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.169030 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-frr-files/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.491768 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-w4xmc_2731a61e-82f6-43be-93c8-d0a5f9000bec/controller/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.526338 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-frr-files/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.536916 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-reloader/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.612074 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-metrics/0.log" Nov 24 13:49:00 crc kubenswrapper[4752]: I1124 13:49:00.784808 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-reloader/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.021338 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-reloader/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.063087 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-metrics/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.069204 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-frr-files/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.077954 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-metrics/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.331246 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-frr-files/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.388328 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-metrics/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.394714 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/cp-reloader/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.428938 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/controller/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.648007 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/frr-metrics/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.668497 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/kube-rbac-proxy-frr/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.668824 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/kube-rbac-proxy/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.906846 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-wwxm2_e7ee7fd9-f333-4573-bd44-49ffa58e8389/frr-k8s-webhook-server/0.log" Nov 24 13:49:01 crc kubenswrapper[4752]: I1124 13:49:01.912317 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/reloader/0.log" Nov 24 13:49:02 crc kubenswrapper[4752]: I1124 13:49:02.177227 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6947b4bf66-9rmcj_cb94bb73-0a35-4724-af4c-8333c6dbc07c/manager/0.log" Nov 24 13:49:02 crc kubenswrapper[4752]: I1124 13:49:02.884214 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b8d98bc54-4tslc_a8771fed-834c-4867-b376-1c8b5347b532/webhook-server/0.log" Nov 24 13:49:02 crc kubenswrapper[4752]: I1124 13:49:02.941406 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zscb2_4cc40b5d-710f-44a1-917e-b330c4dcab18/kube-rbac-proxy/0.log" Nov 24 13:49:04 crc kubenswrapper[4752]: I1124 13:49:04.180563 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zscb2_4cc40b5d-710f-44a1-917e-b330c4dcab18/speaker/0.log" Nov 24 13:49:05 crc kubenswrapper[4752]: I1124 13:49:05.011087 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jjkc5_1244a8ec-63d5-438f-8f4f-40796b68de59/frr/0.log" Nov 24 13:49:17 crc kubenswrapper[4752]: I1124 13:49:17.532937 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/util/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.340554 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/pull/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.353925 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/util/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.389256 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/pull/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.528529 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/util/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.573561 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/pull/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.605126 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931as4qf7_f8f080cb-8080-4eb8-9f6e-61d50d4a9ad5/extract/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.718179 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/util/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.911250 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/pull/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.911281 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/util/0.log" Nov 24 13:49:18 crc kubenswrapper[4752]: I1124 13:49:18.914581 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/pull/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.073390 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/pull/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.078704 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/util/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.102429 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e6h6hf_c193f46d-1b6e-4de5-a7e6-aac42bba0e53/extract/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.227414 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/util/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.433792 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/pull/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.460628 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/util/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.486633 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/pull/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.621645 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/util/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.652300 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/pull/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.680197 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210trm5w_9b24258d-310a-4504-829b-5bbd08f76070/extract/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.783786 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-utilities/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.940975 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-utilities/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.947000 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-content/0.log" Nov 24 13:49:19 crc kubenswrapper[4752]: I1124 13:49:19.974164 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-content/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.116464 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-utilities/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.142989 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/extract-content/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.316045 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-utilities/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.596414 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-utilities/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.596525 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-content/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.682935 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-content/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.780979 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-content/0.log" Nov 24 13:49:20 crc kubenswrapper[4752]: I1124 13:49:20.845167 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/extract-utilities/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.151801 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/util/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.283370 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jdc4c_6d92655d-8f98-4e2b-8200-335ca29de3fb/registry-server/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.294858 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/util/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.414265 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/pull/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.419222 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/pull/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.578918 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/util/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.642265 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/pull/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.669065 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jj77l_9794b695-5b9a-42ba-8e52-8aa1e8b95866/registry-server/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.679996 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c62vfl2_d853d04f-a54c-4f5a-a9b1-197da017ba29/extract/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.834302 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-l9vf6_077748f3-107f-424e-9084-32a79b3ac58f/marketplace-operator/0.log" Nov 24 13:49:21 crc kubenswrapper[4752]: I1124 13:49:21.877052 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-utilities/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.018052 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-utilities/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.031488 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.045073 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.249940 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.264839 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/extract-utilities/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.341867 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-utilities/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.473531 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.507714 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-utilities/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.527791 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.563841 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5jngx_7764e5eb-d7c5-4c69-8aa3-2dfd20e05660/registry-server/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.704580 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-content/0.log" Nov 24 13:49:22 crc kubenswrapper[4752]: I1124 13:49:22.711484 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/extract-utilities/0.log" Nov 24 13:49:23 crc kubenswrapper[4752]: I1124 13:49:23.053615 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4dcbc_4f84466e-5f50-42cc-ae1a-be631dd3d74f/registry-server/0.log" Nov 24 13:49:36 crc kubenswrapper[4752]: I1124 13:49:36.001577 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-xwxrt_58f8c9ac-5f6e-4c39-a911-444d6ccf0391/prometheus-operator/0.log" Nov 24 13:49:36 crc kubenswrapper[4752]: I1124 13:49:36.141219 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5ccb6c774c-bctm6_57f059a9-5d15-4efb-a1d0-392993e2ae4f/prometheus-operator-admission-webhook/0.log" Nov 24 13:49:36 crc kubenswrapper[4752]: I1124 13:49:36.177265 4752 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5ccb6c774c-q26pb_8eca23da-684b-433d-a748-adc988ebd2a0/prometheus-operator-admission-webhook/0.log" Nov 24 13:49:36 crc kubenswrapper[4752]: I1124 13:49:36.332435 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-tqxdw_e1c72bff-b2ec-4286-8016-d05fe2ea859e/operator/0.log" Nov 24 13:49:36 crc kubenswrapper[4752]: I1124 13:49:36.385464 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-grjrp_cbf04b43-41d6-4e39-9aa9-38bc4aa8a2bb/perses-operator/0.log" Nov 24 13:49:45 crc kubenswrapper[4752]: I1124 13:49:45.468706 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:49:45 crc kubenswrapper[4752]: I1124 13:49:45.469334 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:49:46 crc kubenswrapper[4752]: E1124 13:49:46.287002 4752 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.145:42904->38.102.83.145:38429: write tcp 38.102.83.145:42904->38.102.83.145:38429: write: broken pipe Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.284958 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:49:48 crc kubenswrapper[4752]: E1124 13:49:48.285770 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="extract-content" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.285783 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="extract-content" Nov 24 13:49:48 crc kubenswrapper[4752]: E1124 13:49:48.285819 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="registry-server" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.285826 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="registry-server" Nov 24 13:49:48 crc kubenswrapper[4752]: E1124 13:49:48.285862 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="extract-utilities" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.285868 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="extract-utilities" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.286071 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ac8df1-02dc-43f5-93d8-f2c20d05a28c" containerName="registry-server" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.287640 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.303275 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.421878 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz25c\" (UniqueName: \"kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.422052 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.422092 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.524443 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.524515 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.524590 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz25c\" (UniqueName: \"kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.524937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.525137 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.555464 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vz25c\" (UniqueName: \"kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c\") pod \"redhat-marketplace-tx88k\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:48 crc kubenswrapper[4752]: I1124 13:49:48.607017 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:49 crc kubenswrapper[4752]: I1124 13:49:49.146206 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:49:50 crc kubenswrapper[4752]: I1124 13:49:50.137441 4752 generic.go:334] "Generic (PLEG): container finished" podID="15db54a7-d68d-43fb-925d-ce7d11244442" containerID="4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6" exitCode=0 Nov 24 13:49:50 crc kubenswrapper[4752]: I1124 13:49:50.137486 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerDied","Data":"4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6"} Nov 24 13:49:50 crc kubenswrapper[4752]: I1124 13:49:50.137842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerStarted","Data":"7d95adbfa3ea61616ad2fb8de0cdd2044f982e39c45b0e4ee6e0a915a62b4ade"} Nov 24 13:49:51 crc kubenswrapper[4752]: I1124 13:49:51.149313 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerStarted","Data":"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725"} Nov 24 13:49:52 crc kubenswrapper[4752]: I1124 13:49:52.164197 4752 generic.go:334] "Generic (PLEG): container finished" podID="15db54a7-d68d-43fb-925d-ce7d11244442" containerID="fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725" exitCode=0 Nov 24 13:49:52 crc kubenswrapper[4752]: I1124 13:49:52.164472 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerDied","Data":"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725"} Nov 24 13:49:54 crc kubenswrapper[4752]: I1124 13:49:54.198476 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerStarted","Data":"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4"} Nov 24 13:49:54 crc kubenswrapper[4752]: I1124 13:49:54.222131 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tx88k" podStartSLOduration=3.740538048 podStartE2EDuration="6.22211591s" podCreationTimestamp="2025-11-24 13:49:48 +0000 UTC" firstStartedPulling="2025-11-24 13:49:50.139399749 +0000 UTC m=+9796.124220078" lastFinishedPulling="2025-11-24 13:49:52.620977641 +0000 UTC m=+9798.605797940" observedRunningTime="2025-11-24 13:49:54.214421889 +0000 UTC m=+9800.199242178" watchObservedRunningTime="2025-11-24 13:49:54.22211591 +0000 UTC m=+9800.206936199" Nov 24 13:49:58 crc kubenswrapper[4752]: I1124 13:49:58.608151 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:58 crc kubenswrapper[4752]: I1124 13:49:58.608854 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:58 crc kubenswrapper[4752]: I1124 13:49:58.667614 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:59 crc kubenswrapper[4752]: I1124 13:49:59.309418 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:49:59 crc kubenswrapper[4752]: I1124 13:49:59.358165 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:50:01 crc kubenswrapper[4752]: I1124 13:50:01.267868 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tx88k" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="registry-server" containerID="cri-o://fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4" gracePeriod=2 Nov 24 13:50:01 crc kubenswrapper[4752]: I1124 13:50:01.971113 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.029136 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content\") pod \"15db54a7-d68d-43fb-925d-ce7d11244442\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.029330 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz25c\" (UniqueName: \"kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c\") pod \"15db54a7-d68d-43fb-925d-ce7d11244442\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.029381 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities\") pod \"15db54a7-d68d-43fb-925d-ce7d11244442\" (UID: \"15db54a7-d68d-43fb-925d-ce7d11244442\") " Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.030522 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities" (OuterVolumeSpecName: "utilities") pod "15db54a7-d68d-43fb-925d-ce7d11244442" (UID: "15db54a7-d68d-43fb-925d-ce7d11244442"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.060657 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c" (OuterVolumeSpecName: "kube-api-access-vz25c") pod "15db54a7-d68d-43fb-925d-ce7d11244442" (UID: "15db54a7-d68d-43fb-925d-ce7d11244442"). InnerVolumeSpecName "kube-api-access-vz25c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.076582 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15db54a7-d68d-43fb-925d-ce7d11244442" (UID: "15db54a7-d68d-43fb-925d-ce7d11244442"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.132289 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz25c\" (UniqueName: \"kubernetes.io/projected/15db54a7-d68d-43fb-925d-ce7d11244442-kube-api-access-vz25c\") on node \"crc\" DevicePath \"\"" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.132332 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.132350 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15db54a7-d68d-43fb-925d-ce7d11244442-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.282642 4752 generic.go:334] "Generic (PLEG): container finished" podID="15db54a7-d68d-43fb-925d-ce7d11244442" containerID="fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4" exitCode=0 Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.282686 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerDied","Data":"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4"} Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.282712 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx88k" event={"ID":"15db54a7-d68d-43fb-925d-ce7d11244442","Type":"ContainerDied","Data":"7d95adbfa3ea61616ad2fb8de0cdd2044f982e39c45b0e4ee6e0a915a62b4ade"} Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.282728 4752 scope.go:117] "RemoveContainer" containerID="fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.282882 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx88k" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.327828 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.333467 4752 scope.go:117] "RemoveContainer" containerID="fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.340925 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx88k"] Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.354957 4752 scope.go:117] "RemoveContainer" containerID="4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.408395 4752 scope.go:117] "RemoveContainer" containerID="fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4" Nov 24 13:50:02 crc kubenswrapper[4752]: E1124 13:50:02.409017 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4\": container with ID starting with fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4 not found: ID does not exist" containerID="fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.409050 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4"} err="failed to get container status \"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4\": rpc error: code = NotFound desc = could not find container \"fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4\": container with ID starting with fd3f2a5636546a693291817f935a6b800bb6ebfae0d6e76007b5fcda21c437e4 not found: ID does not exist" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.409071 4752 scope.go:117] "RemoveContainer" containerID="fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725" Nov 24 13:50:02 crc kubenswrapper[4752]: E1124 13:50:02.409442 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725\": container with ID starting with fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725 not found: ID does not exist" containerID="fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.409461 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725"} err="failed to get container status \"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725\": rpc error: code = NotFound desc = could not find container \"fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725\": container with ID starting with fa3a47b45531a7a5bacd52ce975c617455ea3a58c8514f02ccf8dda08a3ef725 not found: ID does not exist" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.409474 4752 scope.go:117] "RemoveContainer" containerID="4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6" Nov 24 13:50:02 crc kubenswrapper[4752]: E1124 13:50:02.409893 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6\": container with ID starting with 4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6 not found: ID does not exist" containerID="4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.409942 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6"} err="failed to get container status \"4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6\": rpc error: code = NotFound desc = could not find container \"4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6\": container with ID starting with 4263331c7b5868a9ca0300ad2dfb68c7ba1eda81f6758b179f22d21d01d9d4b6 not found: ID does not exist" Nov 24 13:50:02 crc kubenswrapper[4752]: I1124 13:50:02.757121 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" path="/var/lib/kubelet/pods/15db54a7-d68d-43fb-925d-ce7d11244442/volumes" Nov 24 13:50:15 crc kubenswrapper[4752]: I1124 13:50:15.468551 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 13:50:15 crc kubenswrapper[4752]: I1124 13:50:15.469209 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.971933 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-542h9"] Nov 24 13:50:37 crc kubenswrapper[4752]: E1124 13:50:37.975244 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="registry-server" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.975277 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="registry-server" Nov 24 13:50:37 crc kubenswrapper[4752]: E1124 13:50:37.975313 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="extract-utilities" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.975320 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="extract-utilities" Nov 24 13:50:37 crc kubenswrapper[4752]: E1124 13:50:37.975352 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="extract-content" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.975358 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="extract-content" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.975550 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="15db54a7-d68d-43fb-925d-ce7d11244442" containerName="registry-server" Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 
Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.977326 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:37 crc kubenswrapper[4752]: I1124 13:50:37.993715 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-542h9"]
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.063880 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-utilities\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.063994 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wsp\" (UniqueName: \"kubernetes.io/projected/f5b374ba-49c0-4c39-bba8-247fbd869477-kube-api-access-t2wsp\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.064074 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-catalog-content\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.166217 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-utilities\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.166279 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wsp\" (UniqueName: \"kubernetes.io/projected/f5b374ba-49c0-4c39-bba8-247fbd869477-kube-api-access-t2wsp\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.166378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-catalog-content\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.166903 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-utilities\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.167014 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5b374ba-49c0-4c39-bba8-247fbd869477-catalog-content\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.190302 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wsp\" (UniqueName: \"kubernetes.io/projected/f5b374ba-49c0-4c39-bba8-247fbd869477-kube-api-access-t2wsp\") pod \"redhat-operators-542h9\" (UID: \"f5b374ba-49c0-4c39-bba8-247fbd869477\") " pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.311710 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:38 crc kubenswrapper[4752]: I1124 13:50:38.853896 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-542h9"]
Nov 24 13:50:39 crc kubenswrapper[4752]: I1124 13:50:39.689574 4752 generic.go:334] "Generic (PLEG): container finished" podID="f5b374ba-49c0-4c39-bba8-247fbd869477" containerID="8aebbfe6b2fe1ccdf3fc99084987aa9c83e8a08675d042d835df51e56221b471" exitCode=0
Nov 24 13:50:39 crc kubenswrapper[4752]: I1124 13:50:39.689883 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-542h9" event={"ID":"f5b374ba-49c0-4c39-bba8-247fbd869477","Type":"ContainerDied","Data":"8aebbfe6b2fe1ccdf3fc99084987aa9c83e8a08675d042d835df51e56221b471"}
Nov 24 13:50:39 crc kubenswrapper[4752]: I1124 13:50:39.689916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-542h9" event={"ID":"f5b374ba-49c0-4c39-bba8-247fbd869477","Type":"ContainerStarted","Data":"4ad8560febbc21c268d7447afb2abeb6ed39be21d3c3b142d7f70aece8b6c240"}
Nov 24 13:50:39 crc kubenswrapper[4752]: I1124 13:50:39.692788 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.469788 4752 patch_prober.go:28] interesting pod/machine-config-daemon-vhwb4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.471622 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.472181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4"
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.473819 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"} pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.473907 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerName="machine-config-daemon" containerID="cri-o://5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" gracePeriod=600
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.761531 4752 generic.go:334] "Generic (PLEG): container finished" podID="f890fc2e-8d6c-4109-882a-9e90340097a2" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" exitCode=0
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.761613 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" event={"ID":"f890fc2e-8d6c-4109-882a-9e90340097a2","Type":"ContainerDied","Data":"5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"}
Nov 24 13:50:45 crc kubenswrapper[4752]: I1124 13:50:45.761679 4752 scope.go:117] "RemoveContainer" containerID="d9a0a12b328ff206a22a203489bbc2cd9a54ed504328596d07c6e50879316211"
Nov 24 13:50:50 crc kubenswrapper[4752]: E1124 13:50:50.576891 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:50:50 crc kubenswrapper[4752]: I1124 13:50:50.819726 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:50:50 crc kubenswrapper[4752]: E1124 13:50:50.820353 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:50:51 crc kubenswrapper[4752]: I1124 13:50:51.829188 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-542h9" event={"ID":"f5b374ba-49c0-4c39-bba8-247fbd869477","Type":"ContainerStarted","Data":"fd605a4b27458f7b0b3ca2dca1430b07a95589e44a00e6c317fa1e1be5da5a25"}
Nov 24 13:50:52 crc kubenswrapper[4752]: I1124 13:50:52.842554 4752 generic.go:334] "Generic (PLEG): container finished" podID="f5b374ba-49c0-4c39-bba8-247fbd869477" containerID="fd605a4b27458f7b0b3ca2dca1430b07a95589e44a00e6c317fa1e1be5da5a25" exitCode=0
Nov 24 13:50:52 crc kubenswrapper[4752]: I1124 13:50:52.843070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-542h9" event={"ID":"f5b374ba-49c0-4c39-bba8-247fbd869477","Type":"ContainerDied","Data":"fd605a4b27458f7b0b3ca2dca1430b07a95589e44a00e6c317fa1e1be5da5a25"}
Nov 24 13:50:54 crc kubenswrapper[4752]: I1124 13:50:54.866276 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-542h9" event={"ID":"f5b374ba-49c0-4c39-bba8-247fbd869477","Type":"ContainerStarted","Data":"6b21ece33179d326d22a575b0c5ca802553e89fd6d8073822b14448360bba91a"}
Nov 24 13:50:54 crc kubenswrapper[4752]: I1124 13:50:54.893953 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-542h9" podStartSLOduration=3.67674012 podStartE2EDuration="17.893932781s" podCreationTimestamp="2025-11-24 13:50:37 +0000 UTC" firstStartedPulling="2025-11-24 13:50:39.692452346 +0000 UTC m=+9845.677272635" lastFinishedPulling="2025-11-24 13:50:53.909645017 +0000 UTC m=+9859.894465296" observedRunningTime="2025-11-24 13:50:54.888062852 +0000 UTC m=+9860.872883141" watchObservedRunningTime="2025-11-24 13:50:54.893932781 +0000 UTC m=+9860.878753070"
Nov 24 13:50:58 crc kubenswrapper[4752]: I1124 13:50:58.312395 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:58 crc kubenswrapper[4752]: I1124 13:50:58.313015 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:50:59 crc kubenswrapper[4752]: I1124 13:50:59.364345 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-542h9" podUID="f5b374ba-49c0-4c39-bba8-247fbd869477" containerName="registry-server" probeResult="failure" output=<
Nov 24 13:50:59 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s
Nov 24 13:50:59 crc kubenswrapper[4752]: >
Nov 24 13:51:02 crc kubenswrapper[4752]: I1124 13:51:02.728969 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:51:02 crc kubenswrapper[4752]: E1124 13:51:02.731323 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:51:08 crc kubenswrapper[4752]: I1124 13:51:08.384427 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:51:08 crc kubenswrapper[4752]: I1124 13:51:08.457764 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-542h9"
Nov 24 13:51:08 crc kubenswrapper[4752]: I1124 13:51:08.996344 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-542h9"]
Nov 24 13:51:09 crc kubenswrapper[4752]: I1124 13:51:09.173835 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:51:09 crc kubenswrapper[4752]: I1124 13:51:09.174411 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4dcbc" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="registry-server" containerID="cri-o://ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d" gracePeriod=2
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.048436 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.049099 4752 generic.go:334] "Generic (PLEG): container finished" podID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerID="ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d" exitCode=0
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.049184 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerDied","Data":"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"}
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.049252 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dcbc" event={"ID":"4f84466e-5f50-42cc-ae1a-be631dd3d74f","Type":"ContainerDied","Data":"768ad4872ad7d60396ab5c7cda75ef6e7328af26d372272f864a640f355f027d"}
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.049273 4752 scope.go:117] "RemoveContainer" containerID="ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.090891 4752 scope.go:117] "RemoveContainer" containerID="784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.122359 4752 scope.go:117] "RemoveContainer" containerID="0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.175521 4752 scope.go:117] "RemoveContainer" containerID="ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"
Nov 24 13:51:10 crc kubenswrapper[4752]: E1124 13:51:10.176289 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d\": container with ID starting with ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d not found: ID does not exist" containerID="ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.176330 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d"} err="failed to get container status \"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d\": rpc error: code = NotFound desc = could not find container \"ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d\": container with ID starting with ac8d17017db8b9d73ff82d4b2fb63c2b57a697f070cdc437fd6fc090a2d22d0d not found: ID does not exist"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.176353 4752 scope.go:117] "RemoveContainer" containerID="784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"
Nov 24 13:51:10 crc kubenswrapper[4752]: E1124 13:51:10.177181 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac\": container with ID starting with 784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac not found: ID does not exist" containerID="784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.177206 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac"} err="failed to get container status \"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac\": rpc error: code = NotFound desc = could not find container \"784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac\": container with ID starting with 784d07c4126c945e3a54fb46d233371e710f0c83a3edddfae56f371c8bde34ac not found: ID does not exist"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.177218 4752 scope.go:117] "RemoveContainer" containerID="0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b"
Nov 24 13:51:10 crc kubenswrapper[4752]: E1124 13:51:10.177590 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b\": container with ID starting with 0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b not found: ID does not exist" containerID="0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.177610 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b"} err="failed to get container status \"0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b\": rpc error: code = NotFound desc = could not find container \"0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b\": container with ID starting with 0dcbcafc233f67c5468499e3160b3538264ca2321553ad30cdcf16066e93dd0b not found: ID does not exist"
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.214733 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content\") pod \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") "
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.214932 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgwn\" (UniqueName: \"kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn\") pod \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") "
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.215460 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities\") pod \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\" (UID: \"4f84466e-5f50-42cc-ae1a-be631dd3d74f\") "
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.216968 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities" (OuterVolumeSpecName: "utilities") pod "4f84466e-5f50-42cc-ae1a-be631dd3d74f" (UID: "4f84466e-5f50-42cc-ae1a-be631dd3d74f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.220262 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.224819 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn" (OuterVolumeSpecName: "kube-api-access-5hgwn") pod "4f84466e-5f50-42cc-ae1a-be631dd3d74f" (UID: "4f84466e-5f50-42cc-ae1a-be631dd3d74f"). InnerVolumeSpecName "kube-api-access-5hgwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.310878 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f84466e-5f50-42cc-ae1a-be631dd3d74f" (UID: "4f84466e-5f50-42cc-ae1a-be631dd3d74f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.322820 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f84466e-5f50-42cc-ae1a-be631dd3d74f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 13:51:10 crc kubenswrapper[4752]: I1124 13:51:10.322859 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgwn\" (UniqueName: \"kubernetes.io/projected/4f84466e-5f50-42cc-ae1a-be631dd3d74f-kube-api-access-5hgwn\") on node \"crc\" DevicePath \"\""
Nov 24 13:51:11 crc kubenswrapper[4752]: I1124 13:51:11.060963 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dcbc"
Nov 24 13:51:11 crc kubenswrapper[4752]: I1124 13:51:11.087323 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:51:11 crc kubenswrapper[4752]: I1124 13:51:11.096477 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4dcbc"]
Nov 24 13:51:12 crc kubenswrapper[4752]: I1124 13:51:12.749511 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" path="/var/lib/kubelet/pods/4f84466e-5f50-42cc-ae1a-be631dd3d74f/volumes"
Nov 24 13:51:15 crc kubenswrapper[4752]: I1124 13:51:15.728670 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:51:15 crc kubenswrapper[4752]: E1124 13:51:15.729452 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:51:28 crc kubenswrapper[4752]: I1124 13:51:28.736296 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:51:28 crc kubenswrapper[4752]: E1124 13:51:28.741518 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:51:41 crc kubenswrapper[4752]: I1124 13:51:41.728821 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:51:41 crc kubenswrapper[4752]: E1124 13:51:41.730841 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:51:50 crc kubenswrapper[4752]: I1124 13:51:50.988613 4752 scope.go:117] "RemoveContainer" containerID="7da1c2b865bcae4470b2c53b84cd75c0aff709c91b7f817972b253c0a482662c"
Nov 24 13:51:51 crc kubenswrapper[4752]: I1124 13:51:51.011009 4752 scope.go:117] "RemoveContainer" containerID="bf849acacc14fc92662e841240f947afc4514e0c473cdff349aef43fe030c09c"
Nov 24 13:51:51 crc kubenswrapper[4752]: I1124 13:51:51.034130 4752 scope.go:117] "RemoveContainer" containerID="26616219592c9a1654c84049e9bee0f90cfdca8126fc2a40d55245b22ede689f"
Nov 24 13:51:52 crc kubenswrapper[4752]: I1124 13:51:52.728797 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530"
Nov 24 13:51:52 crc kubenswrapper[4752]: E1124 13:51:52.729431 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2"
Nov 24 13:51:53 crc kubenswrapper[4752]: I1124 13:51:53.597893 4752 generic.go:334] "Generic (PLEG): container finished" podID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerID="1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21" exitCode=0
Nov 24 13:51:53 crc kubenswrapper[4752]: I1124 13:51:53.598115 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" event={"ID":"82a974f6-7004-41a1-ba5c-bb67b6d5b23d","Type":"ContainerDied","Data":"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"}
Nov 24 13:51:53 crc kubenswrapper[4752]: I1124 13:51:53.599411 4752 scope.go:117] "RemoveContainer" containerID="1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"
Nov 24 13:51:54 crc kubenswrapper[4752]: I1124 13:51:54.905949 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6hbmt_must-gather-m7jq2_82a974f6-7004-41a1-ba5c-bb67b6d5b23d/gather/0.log"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.284177 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6hbmt/must-gather-m7jq2"]
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.285047 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6hbmt/must-gather-m7jq2" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="copy" containerID="cri-o://467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a" gracePeriod=2
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.299985 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6hbmt/must-gather-m7jq2"]
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.822052 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6hbmt_must-gather-m7jq2_82a974f6-7004-41a1-ba5c-bb67b6d5b23d/copy/0.log"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.822734 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hbmt/must-gather-m7jq2"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.940276 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6hbmt_must-gather-m7jq2_82a974f6-7004-41a1-ba5c-bb67b6d5b23d/copy/0.log"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.941083 4752 generic.go:334] "Generic (PLEG): container finished" podID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerID="467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a" exitCode=143
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.941128 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hbmt/must-gather-m7jq2"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.941166 4752 scope.go:117] "RemoveContainer" containerID="467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a"
Nov 24 13:52:04 crc kubenswrapper[4752]: I1124 13:52:04.981104 4752 scope.go:117] "RemoveContainer" containerID="1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.003638 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output\") pod \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") "
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.003803 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfnzs\" (UniqueName: \"kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs\") pod \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\" (UID: \"82a974f6-7004-41a1-ba5c-bb67b6d5b23d\") "
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.017706 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs" (OuterVolumeSpecName: "kube-api-access-jfnzs") pod "82a974f6-7004-41a1-ba5c-bb67b6d5b23d" (UID: "82a974f6-7004-41a1-ba5c-bb67b6d5b23d"). InnerVolumeSpecName "kube-api-access-jfnzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.038531 4752 scope.go:117] "RemoveContainer" containerID="467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a"
Nov 24 13:52:05 crc kubenswrapper[4752]: E1124 13:52:05.039529 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a\": container with ID starting with 467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a not found: ID does not exist" containerID="467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a"
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.039602 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a"} err="failed to get container status \"467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a\": rpc error: code = NotFound desc = could not find container \"467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a\": container with ID starting with 467bfe7c8afebe585c07bd7ed2d759a54393460ba0c516131b17f9837a45171a not found: ID does not exist"
Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.039652 4752 scope.go:117] "RemoveContainer" containerID="1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"
Nov 24 13:52:05 crc kubenswrapper[4752]: E1124 13:52:05.040100 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21\": container with ID starting with 1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21 not found: ID does not exist" containerID="1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"
Nov 24 13:52:05 crc kubenswrapper[4752]:
I1124 13:52:05.040159 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21"} err="failed to get container status \"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21\": rpc error: code = NotFound desc = could not find container \"1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21\": container with ID starting with 1e0fdd6335fa3ee07faa1f5ea90b07b615fd21c158a27d17e6c912eac3ecbc21 not found: ID does not exist" Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.105950 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfnzs\" (UniqueName: \"kubernetes.io/projected/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-kube-api-access-jfnzs\") on node \"crc\" DevicePath \"\"" Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.219398 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "82a974f6-7004-41a1-ba5c-bb67b6d5b23d" (UID: "82a974f6-7004-41a1-ba5c-bb67b6d5b23d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.311048 4752 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82a974f6-7004-41a1-ba5c-bb67b6d5b23d-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 24 13:52:05 crc kubenswrapper[4752]: I1124 13:52:05.727697 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:52:05 crc kubenswrapper[4752]: E1124 13:52:05.728444 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:52:06 crc kubenswrapper[4752]: I1124 13:52:06.743563 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" path="/var/lib/kubelet/pods/82a974f6-7004-41a1-ba5c-bb67b6d5b23d/volumes" Nov 24 13:52:19 crc kubenswrapper[4752]: I1124 13:52:19.728199 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:52:19 crc kubenswrapper[4752]: E1124 13:52:19.729416 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:52:33 crc kubenswrapper[4752]: I1124 13:52:33.728739 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:52:33 crc kubenswrapper[4752]: E1124 13:52:33.729382 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:52:48 crc kubenswrapper[4752]: I1124 13:52:48.727953 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:52:48 crc kubenswrapper[4752]: E1124 13:52:48.728712 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:53:00 crc kubenswrapper[4752]: I1124 13:53:00.729807 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:53:00 crc kubenswrapper[4752]: E1124 13:53:00.731030 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:53:11 crc kubenswrapper[4752]: I1124 13:53:11.728447 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:53:11 crc kubenswrapper[4752]: E1124 13:53:11.729399 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:53:22 crc kubenswrapper[4752]: I1124 13:53:22.728536 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:53:22 crc kubenswrapper[4752]: E1124 13:53:22.730028 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:53:36 crc kubenswrapper[4752]: I1124 13:53:36.728456 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:53:36 crc kubenswrapper[4752]: E1124 13:53:36.729807 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:53:47 crc kubenswrapper[4752]: I1124 13:53:47.728425 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:53:47 crc kubenswrapper[4752]: E1124 13:53:47.729812 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:54:01 crc kubenswrapper[4752]: I1124 13:54:01.728982 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:54:01 crc kubenswrapper[4752]: E1124 13:54:01.731209 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.485671 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.487824 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="gather" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.487844 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="gather" Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.487865 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="registry-server" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.487870 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="registry-server" Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.487894 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="extract-utilities" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.487901 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="extract-utilities" Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.487914 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="extract-content" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.487920 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="extract-content" Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.487932 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="copy" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.487937 4752 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="copy" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.488236 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="copy" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.488393 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f84466e-5f50-42cc-ae1a-be631dd3d74f" containerName="registry-server" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.488411 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a974f6-7004-41a1-ba5c-bb67b6d5b23d" containerName="gather" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.490285 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.500224 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.569715 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.569942 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.570002 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7pr2\" (UniqueName: \"kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.672566 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.672665 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7pr2\" (UniqueName: \"kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.672717 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.673226 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.674409 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.691280 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7pr2\" (UniqueName: \"kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2\") pod \"community-operators-9ntz5\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.735154 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:54:14 crc kubenswrapper[4752]: E1124 13:54:14.735392 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:54:14 crc kubenswrapper[4752]: I1124 13:54:14.826760 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:15 crc kubenswrapper[4752]: I1124 13:54:15.413725 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:15 crc kubenswrapper[4752]: I1124 13:54:15.492401 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerStarted","Data":"67f7ac04297f0918b0c39f3b7f518715992e339b015dee62d71342d92590079b"} Nov 24 13:54:17 crc kubenswrapper[4752]: I1124 13:54:17.522569 4752 generic.go:334] "Generic (PLEG): container finished" podID="57f442f1-3c48-4175-a20b-e888e69996c7" containerID="a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f" exitCode=0 Nov 24 13:54:17 crc kubenswrapper[4752]: I1124 13:54:17.522669 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerDied","Data":"a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f"} Nov 24 13:54:19 crc kubenswrapper[4752]: I1124 13:54:19.561225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerStarted","Data":"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951"} Nov 24 13:54:21 crc kubenswrapper[4752]: I1124 13:54:21.586967 4752 generic.go:334] "Generic (PLEG): container finished" podID="57f442f1-3c48-4175-a20b-e888e69996c7" containerID="de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951" exitCode=0 Nov 24 13:54:21 crc kubenswrapper[4752]: I1124 13:54:21.587566 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerDied","Data":"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951"} Nov 24 13:54:22 crc kubenswrapper[4752]: I1124 13:54:22.601143 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerStarted","Data":"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc"} Nov 24 13:54:22 crc kubenswrapper[4752]: I1124 13:54:22.633286 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9ntz5" podStartSLOduration=4.165176291 podStartE2EDuration="8.633268886s" podCreationTimestamp="2025-11-24 13:54:14 +0000 UTC" firstStartedPulling="2025-11-24 13:54:17.52682403 +0000 UTC m=+10063.511644359" lastFinishedPulling="2025-11-24 13:54:21.994916655 +0000 UTC m=+10067.979736954" observedRunningTime="2025-11-24 13:54:22.625359939 +0000 UTC m=+10068.610180248" watchObservedRunningTime="2025-11-24 13:54:22.633268886 +0000 UTC m=+10068.618089175" Nov 24 13:54:24 crc kubenswrapper[4752]: I1124 13:54:24.827391 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:24 crc kubenswrapper[4752]: I1124 13:54:24.829468 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:24 crc kubenswrapper[4752]: I1124 13:54:24.879563 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:29 crc kubenswrapper[4752]: I1124 13:54:29.727836 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:54:29 crc kubenswrapper[4752]: E1124 13:54:29.728669 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:54:34 crc kubenswrapper[4752]: I1124 13:54:34.890922 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:34 crc kubenswrapper[4752]: I1124 13:54:34.943667 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:35 crc kubenswrapper[4752]: I1124 13:54:35.744799 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9ntz5" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="registry-server" containerID="cri-o://a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc" gracePeriod=2 Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.272688 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.329634 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities\") pod \"57f442f1-3c48-4175-a20b-e888e69996c7\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.329732 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content\") pod \"57f442f1-3c48-4175-a20b-e888e69996c7\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.330040 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7pr2\" (UniqueName: \"kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2\") pod \"57f442f1-3c48-4175-a20b-e888e69996c7\" (UID: \"57f442f1-3c48-4175-a20b-e888e69996c7\") " Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.331006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities" (OuterVolumeSpecName: "utilities") pod "57f442f1-3c48-4175-a20b-e888e69996c7" (UID: "57f442f1-3c48-4175-a20b-e888e69996c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.337740 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2" (OuterVolumeSpecName: "kube-api-access-w7pr2") pod "57f442f1-3c48-4175-a20b-e888e69996c7" (UID: "57f442f1-3c48-4175-a20b-e888e69996c7"). 
InnerVolumeSpecName "kube-api-access-w7pr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.394529 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57f442f1-3c48-4175-a20b-e888e69996c7" (UID: "57f442f1-3c48-4175-a20b-e888e69996c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.431551 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.431591 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57f442f1-3c48-4175-a20b-e888e69996c7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.431607 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7pr2\" (UniqueName: \"kubernetes.io/projected/57f442f1-3c48-4175-a20b-e888e69996c7-kube-api-access-w7pr2\") on node \"crc\" DevicePath \"\"" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.786885 4752 generic.go:334] "Generic (PLEG): container finished" podID="57f442f1-3c48-4175-a20b-e888e69996c7" containerID="a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc" exitCode=0 Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.786967 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerDied","Data":"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc"} Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.787314 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntz5" event={"ID":"57f442f1-3c48-4175-a20b-e888e69996c7","Type":"ContainerDied","Data":"67f7ac04297f0918b0c39f3b7f518715992e339b015dee62d71342d92590079b"} Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.787414 4752 scope.go:117] "RemoveContainer" containerID="a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.787028 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9ntz5" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.824781 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.825142 4752 scope.go:117] "RemoveContainer" containerID="de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.839388 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9ntz5"] Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.847992 4752 scope.go:117] "RemoveContainer" containerID="a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.896568 4752 scope.go:117] "RemoveContainer" containerID="a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc" Nov 24 13:54:36 crc kubenswrapper[4752]: E1124 13:54:36.898378 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc\": container with ID starting with a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc not found: ID does not exist" containerID="a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.898423 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc"} err="failed to get container status \"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc\": rpc error: code = NotFound desc = could not find container \"a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc\": container with ID starting with a109af28fb8cfb80392e69006c3bb0b6c11d60c33bd342344aa48fbea001cadc not found: ID does not exist" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.898454 4752 scope.go:117] "RemoveContainer" containerID="de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951" Nov 24 13:54:36 crc kubenswrapper[4752]: E1124 13:54:36.898871 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951\": container with ID starting with de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951 not found: ID does not exist" containerID="de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.898951 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951"} err="failed to get container status \"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951\": rpc error: code = NotFound desc = could not find container \"de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951\": container with ID starting with de8bc67325dc57d6548703a57afa3fd78a60659b0b255a9175f29a29fdada951 not found: ID does not exist" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.898996 4752 scope.go:117] "RemoveContainer" containerID="a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f" Nov 24 13:54:36 crc kubenswrapper[4752]: E1124 13:54:36.899504 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f\": container with ID starting with a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f not found: ID does not exist" containerID="a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f" Nov 24 13:54:36 crc kubenswrapper[4752]: I1124 13:54:36.899531 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f"} err="failed to get container status \"a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f\": rpc error: code = NotFound desc = could not find container \"a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f\": container with ID starting with a1015e6583423e0f73706b4a1db839bf654ec268fcc118362d5a508e574e949f not found: ID does not exist" Nov 24 13:54:38 crc kubenswrapper[4752]: I1124 13:54:38.750067 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" path="/var/lib/kubelet/pods/57f442f1-3c48-4175-a20b-e888e69996c7/volumes" Nov 24 13:54:42 crc kubenswrapper[4752]: I1124 13:54:42.728958 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:54:42 crc kubenswrapper[4752]: E1124 13:54:42.729947 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:54:57 crc kubenswrapper[4752]: I1124 13:54:57.728374 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:54:57 crc kubenswrapper[4752]: E1124 13:54:57.729066 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:55:11 crc kubenswrapper[4752]: I1124 13:55:11.727730 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:55:11 crc kubenswrapper[4752]: E1124 13:55:11.728567 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:55:22 crc kubenswrapper[4752]: I1124 13:55:22.728944 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:55:22 crc kubenswrapper[4752]: E1124 13:55:22.730003 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:55:36 crc kubenswrapper[4752]: I1124 13:55:36.729114 4752 scope.go:117] "RemoveContainer" containerID="5c3f2c41f195abfc5e76a86b31163072a21d4f0bdd1cd956393a7148e514a530" Nov 24 13:55:36 crc kubenswrapper[4752]: E1124 13:55:36.729927 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vhwb4_openshift-machine-config-operator(f890fc2e-8d6c-4109-882a-9e90340097a2)\"" pod="openshift-machine-config-operator/machine-config-daemon-vhwb4" podUID="f890fc2e-8d6c-4109-882a-9e90340097a2" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.707168 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-625wt"] Nov 24 13:55:37 crc kubenswrapper[4752]: E1124 13:55:37.708197 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="extract-utilities" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.708223 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="extract-utilities" Nov 24 13:55:37 crc kubenswrapper[4752]: E1124 13:55:37.708260 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="extract-content" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.708272 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="extract-content" Nov 24 13:55:37 crc kubenswrapper[4752]: E1124 13:55:37.708304 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="registry-server" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.708319 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="registry-server" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.708672 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f442f1-3c48-4175-a20b-e888e69996c7" containerName="registry-server" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.711279 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.717402 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-625wt"] Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.841824 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-catalog-content\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.841974 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkxp\" (UniqueName: \"kubernetes.io/projected/a3882c71-bf49-46da-a5d4-b6ba560f136c-kube-api-access-bgkxp\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.842065 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-utilities\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.944055 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkxp\" (UniqueName: \"kubernetes.io/projected/a3882c71-bf49-46da-a5d4-b6ba560f136c-kube-api-access-bgkxp\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.944203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-utilities\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.944679 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-utilities\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.944872 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-catalog-content\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.945226 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3882c71-bf49-46da-a5d4-b6ba560f136c-catalog-content\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:37 crc kubenswrapper[4752]: I1124 13:55:37.967080 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bgkxp\" (UniqueName: \"kubernetes.io/projected/a3882c71-bf49-46da-a5d4-b6ba560f136c-kube-api-access-bgkxp\") pod \"certified-operators-625wt\" (UID: \"a3882c71-bf49-46da-a5d4-b6ba560f136c\") " pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:38 crc kubenswrapper[4752]: I1124 13:55:38.040595 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:38 crc kubenswrapper[4752]: W1124 13:55:38.675797 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3882c71_bf49_46da_a5d4_b6ba560f136c.slice/crio-2a6ce6774022c8ad2fe712a18207c4191df31fed3c5e0a3b42be566ad672fa55 WatchSource:0}: Error finding container 2a6ce6774022c8ad2fe712a18207c4191df31fed3c5e0a3b42be566ad672fa55: Status 404 returned error can't find the container with id 2a6ce6774022c8ad2fe712a18207c4191df31fed3c5e0a3b42be566ad672fa55 Nov 24 13:55:38 crc kubenswrapper[4752]: I1124 13:55:38.686717 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-625wt"] Nov 24 13:55:39 crc kubenswrapper[4752]: I1124 13:55:39.497188 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3882c71-bf49-46da-a5d4-b6ba560f136c" containerID="a2f73dc26d22b6ddd779cb29c6fa128eb104c258b7bc9c19382877261e4b6ae4" exitCode=0 Nov 24 13:55:39 crc kubenswrapper[4752]: I1124 13:55:39.497287 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-625wt" event={"ID":"a3882c71-bf49-46da-a5d4-b6ba560f136c","Type":"ContainerDied","Data":"a2f73dc26d22b6ddd779cb29c6fa128eb104c258b7bc9c19382877261e4b6ae4"} Nov 24 13:55:39 crc kubenswrapper[4752]: I1124 13:55:39.497480 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-625wt" event={"ID":"a3882c71-bf49-46da-a5d4-b6ba560f136c","Type":"ContainerStarted","Data":"2a6ce6774022c8ad2fe712a18207c4191df31fed3c5e0a3b42be566ad672fa55"} Nov 24 13:55:41 crc kubenswrapper[4752]: I1124 13:55:41.526303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-625wt" event={"ID":"a3882c71-bf49-46da-a5d4-b6ba560f136c","Type":"ContainerStarted","Data":"319a67814f77130f15c1c34fd4965d9f543fdf538827f23a0e81619ba2e2a461"} Nov 24 13:55:42 crc kubenswrapper[4752]: I1124 13:55:42.538684 4752 generic.go:334] "Generic (PLEG): container finished" podID="a3882c71-bf49-46da-a5d4-b6ba560f136c" containerID="319a67814f77130f15c1c34fd4965d9f543fdf538827f23a0e81619ba2e2a461" exitCode=0 Nov 24 13:55:42 crc kubenswrapper[4752]: I1124 13:55:42.539066 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-625wt" event={"ID":"a3882c71-bf49-46da-a5d4-b6ba560f136c","Type":"ContainerDied","Data":"319a67814f77130f15c1c34fd4965d9f543fdf538827f23a0e81619ba2e2a461"} Nov 24 13:55:42 crc kubenswrapper[4752]: I1124 13:55:42.541718 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 13:55:44 crc kubenswrapper[4752]: I1124 13:55:44.558677 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-625wt" event={"ID":"a3882c71-bf49-46da-a5d4-b6ba560f136c","Type":"ContainerStarted","Data":"f7209fae22fb1d3b7f97462d41508e62e4e4c945f4ca8df30f987fcd87d900ce"} Nov 24 13:55:44 crc kubenswrapper[4752]: I1124 
13:55:44.582594 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-625wt" podStartSLOduration=3.297204253 podStartE2EDuration="7.582574286s" podCreationTimestamp="2025-11-24 13:55:37 +0000 UTC" firstStartedPulling="2025-11-24 13:55:39.500008795 +0000 UTC m=+10145.484829084" lastFinishedPulling="2025-11-24 13:55:43.785378828 +0000 UTC m=+10149.770199117" observedRunningTime="2025-11-24 13:55:44.574914416 +0000 UTC m=+10150.559734725" watchObservedRunningTime="2025-11-24 13:55:44.582574286 +0000 UTC m=+10150.567394575" Nov 24 13:55:48 crc kubenswrapper[4752]: I1124 13:55:48.041841 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:48 crc kubenswrapper[4752]: I1124 13:55:48.042639 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:48 crc kubenswrapper[4752]: I1124 13:55:48.125852 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:48 crc kubenswrapper[4752]: I1124 13:55:48.682014 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-625wt" Nov 24 13:55:48 crc kubenswrapper[4752]: I1124 13:55:48.755469 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-625wt"]
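
Editor's note on the capture above: three threads interleave here — machine-config-daemon-vhwb4 repeatedly hitting its CrashLoopBackOff ceiling ("back-off 5m0s": the kubelet re-evaluates the pod every sync pass, logs the pod_workers.go:1301 error, and skips the restart until the back-off window expires), the teardown of the must-gather-m7jq2 pod (container kill, volume unmount, then the expected NotFound errors when deleting already-removed containers), and two marketplace catalog pods (community-operators-9ntz5, certified-operators-625wt) cycling through extract-utilities, extract-content, and registry-server. A minimal sketch for tallying the CrashLoopBackOff retries per pod from a journal stream follows; it is an illustrative analysis tool, not part of the captured system — the file name logscan.go is hypothetical, and it assumes journal lines on stdin in the same klog-style format shown above.

// logscan.go — hypothetical helper: count kubelet CrashLoopBackOff retry
// records per pod from journal output piped to stdin. Assumes entries like
// the ones above, where the top-level pod field is pod="namespace/name".
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// First unescaped pod="..." field on the line; in the back-off records
	// above this is the trailing pod="openshift-machine-config-operator/...".
	podRE = regexp.MustCompile(`pod="([^"]+)"`)
	// Marks the pod_workers.go "Error syncing pod" back-off records.
	backoffRE = regexp.MustCompile(`with CrashLoopBackOff`)
)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if !backoffRE.MatchString(line) {
			continue
		}
		if m := podRE.FindStringSubmatch(line); m != nil {
			counts[m[1]]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "scan:", err)
		os.Exit(1)
	}
	for pod, n := range counts {
		fmt.Printf("%6d  %s\n", n, pod)
	}
}

Typical use would be something like `journalctl -u kubelet | go run logscan.go`; on this capture it would report a steadily growing count for openshift-machine-config-operator/machine-config-daemon-vhwb4, which is the signature of a container stuck at the maximum restart back-off rather than one failing in new ways.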